Hacker News with inline top comments (Best, 16 May 2016)
Introducing unlimited private repositories github.com
1571 points by fuzionmonkey  4 days ago   632 comments top 120
hunvreus 4 days ago 24 replies      
1. Take a gazillion dollars in funding on an over-hyped valuation,

2. Go through significant organizational changes that end up with the departure of a co-founder (and more suits in the building).

3. Notice that a significant segment of your growth (VC-funded startups) are running out of money.

4. Switch to user-based pricing to generate more revenue for investors, but spin it as a freebie: "Hey! Look at the cool unlimited shit! No, no! Don't pay attention to the fact that you're gonna be charged 3 times as much as before for the same service."

The bottom line is that GitHub is free to do whatever the heck they want; if they believe that charging per user is going to make more (financial) sense to them, then they can go ahead and do it.

But I'd appreciate it if their PR department didn't expect us to swallow this as a positive change. Most coders understand basic maths.

arnvald 4 days ago 22 replies      
A small comparison:

Team | Cost Before | Cost Now
1 repo, 5 users | $25 | $25
1 repo, 10 users | $25 | $70
11 repos, 5 users | $50 | $25
11 repos, 10 users | $50 | $70
5 repos, 50 users | $25 | $430
50 repos, 5 users | $100 | $25
50 repos, 50 users | $100 | $430

I'm not sure how common organizations with few users and a large number of repos are - I guess software houses that keep old projects (for maintenance and future requests from clients) fall into this category, but who else?

The other case where it becomes cheaper is personal accounts.

In all the other cases - it just looks like a price increase.
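The arithmetic behind the table can be sketched in a few lines (the legacy tiers are inferred from the rows above, and the new plan is taken as $25 covering the first 5 users plus $9 per additional user, per the announcement):

```python
def old_cost(repos):
    """Legacy per-repo tiers, as inferred from the comparison table."""
    for limit, price in [(10, 25), (20, 50), (50, 100), (125, 200)]:
        if repos <= limit:
            return price
    raise ValueError("beyond the largest legacy tier")

def new_cost(users):
    """New per-user plan: $25 covers the first 5 users, $9 each after that."""
    return 25 + 9 * max(0, users - 5)

# Reproduce a few rows of the table above.
for repos, users in [(1, 10), (11, 5), (50, 50)]:
    print(f"{repos} repos, {users} users: ${old_cost(repos)} -> ${new_cost(users)}")
```

The crossover is purely about the users-to-repos ratio: many repos and few users gets cheaper, few repos and many users gets much more expensive.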

beberlei 4 days ago 2 replies      
The incentive changes from this are massive - a nice "experiment" from an economics perspective.

1. It penalizes open-source organizations that need a few private repos for passwords, server configuration, or other things. It was $25 before; now, for example, Doctrine with its 48 collaborators would pay $394, even if just the admins have access to that repository.

2. It penalizes collaboration. Inviting every non-technical person in the company? 2-5 employees of the customer? Not really. It will lead organizations to create a single shared "non-technical" user that everyone can use to comment on stuff. Not to mention bots, especially since you need users for servers in more complex deployment scenarios.

3. It rewards having many repos: small throwaway stuff, and generally "messy" repositories lying around everywhere that are committed to once or twice and never touched again. "Not having to think about another private repository" will, imho, produce technical debt for organizations.

4. Users in many private orgs will be paid for once per organization each. I myself will now be worth $45 to GitHub, being in private repositories of five different companies.

All in all, this just shows that Github does not care as much about open source anymore as it cares about Enterprise.

Btw: mentioning the price jumps in repository usage under the old pricing is not really helpful. Consider a pricing scheme that was per repository ($1 for personal, $2 for organizations) and had no jumps, and compare that to the new per-user pricing. The new pricing only feels better to some because you pay a marginal cost for every single user, instead of the old pricing where every 50 repositories you suddenly had to pay $100 extra.

Edit: Forgot about bots, and deployment machine users (which even Github recommends for many scenarios)

gelatocar 4 days ago 4 replies      
What about companies like Epic Games that have few repos but many users?

With their 2 private UnrealEngine and UnrealTournament repos they would have been paying $25 a month and under the new pricing structure will have to pay $815,913 per month...

edit: That's based on what I can see as a UE4 subscriber, 2 private repos and 90657 users.

biztos 4 days ago 7 replies      
I find it interesting that so many people here are unhappy with the change. Sure, prices will go up for a lot of organizations, but is $9/worker/month really a lot to pay for all the stuff GitHub offers? At Bay Area prices isn't that about 5 minutes of developer pay per month?

For independent use it seems like a very positive change, in fact I'm guessing it's a direct challenge to GitLab. I was considering moving my stuff to GitLab simply because I'm tired of bundling experiments/prototypes into umbrella repos just to stay under the 10 repo limit at GitHub. For people like me this will be awesome, and I take it as a good sign that they're responding to the competition.

One thing I don't get however: how do they count shared access to private repos?

If I have a private repo and you have a private repo, and we each grant access to the other's repo so we can collaborate, do we now have two or four billing units?

They say "you can even invite a few collaborators" -- but how are you billed if it's more than a "few?"

I don't mind if they try to close the loophole of making up an "organization" out of a lot of "individual developers" but it seems a little vague.

grawlinson 4 days ago 2 replies      
That's cool but seeing as Bitbucket has unlimited private repos for everyone, I'll be sticking with Bitbucket for private trash and Github for public trash.
sudhirj 4 days ago 2 replies      
What's with all the negativity? This is really good pricing - all individuals now pay much less (a flat rate of $7), all small shops pay almost the same thing ($30 to $90 for 3 to 10 people). Both groups no longer need to think twice about creating repos, which has always been a huge pain that I've seen. I've even thought twice about microservices because the repo cost would be a pain.

This will affect enterprises - but then they're either already on GitHub Enterprise or are used to per-user pricing anyway. Google Apps, Slack etc. all have (quantitatively similar) per-user pricing. Google doesn't charge you based on the number of emails you send, nor does Slack charge based on the number of private rooms there are - that would be dumb.

The band of companies between small shops and enterprises are likely to be affected, but then this is really employee lunch money.

rspeer 4 days ago 1 reply      
This is, of course, a positive way to spin the fact that they're raising prices significantly for many organizations.

I'm glad there's at least a year that we can keep using the old plans.

bsnape 4 days ago 3 replies      
This has almost quadrupled our monthly cost ($850 vs $2914). We have ~300 users which will have to be reduced massively to save costs - perhaps with non-engineers sharing accounts or having no access at all. I'm not sure if charging per user is really in the spirit of open collaboration that GitHub champions.

I also wonder if charging per user rather than per repo will discourage the creation of open-source repos from orgs? There's no longer a (reduced) cost benefit after all, even if that was a minor influence compared with the other benefits of open-sourcing your code.

0xmohit 4 days ago 1 reply      
With this change, BitBucket pricing [0] gets to appear pretty attractive.

(If you were an organization with few private repositories and a large number of users, GitHub was previously more affordable.)

[0] https://bitbucket.org/product/pricing?tab=cloud-pricing

kapv89 4 days ago 2 replies      
Nothing beats https://bitbucket.org/ when it comes to free, unlimited private repositories. It has seen the first hosted repositories of far more startups than GitHub ever will, which is a special achievement in itself.
romanovcode 4 days ago 4 replies      
I see absolutely no reason why one would pay GitHub for private repositories when there is Bitbucket, or much better alternative to GitHub altogether - GitLab.
bufordsharkley 4 days ago 3 replies      
Have been using Github for a community radio station, have been encouraging all staffers to use github accounts to file issues against our private repos, etc. The friendly policies for many collaborators have made this attractive, even though most users have rarely interacted with the repos, if at all.

Now each user for the private repo has a significant cost (pretty significant for a non-profit community radio station); looks like we'll have to rethink this whole Github thing.

therealmarv 4 days ago 4 replies      
It seems most users here don't have gitlab.com on their radar and only mention Bitbucket as a competitor. I've recently switched all my personal private repos to gitlab.com, which also allows unlimited private repositories, because gitlab.com seems to have a better UI and more features than Bitbucket (when not buying additional Atlassian products like Jira).
t3nary 4 days ago 3 replies      
Does anyone know if this will affect student plans as well? So far they included a free micro plan with the usual 5 private repos. It would be pretty awesome; I just had to host a repo somewhere else a few days ago because I ran out of private repos.

Other than that it sounds like a great improvement, it'll make it a lot more likely that I'll pay for GitHub when I'm not a student anymore.

/edit: https://github.com/pricing makes it sound like this is for free student plans as well

m4tthumphrey 4 days ago 2 replies      
I find it quite hard to comprehend why people use GitHub for private repositories. There are many free alternatives. Bitbucket seems to be the best-known one, but GitLab has grown into an amazing product with 3 different offerings: on-premise community edition, on-premise enterprise, and hosted (like GitHub).

We have used the on premise community edition for about 3 years now. I first installed it when you had to run about a billion commands manually and it was great even then. Now you can install it with an apt-get and a few lines.

Let's not forget about the obvious negatives of GitHub (ignoring pricing):

1) It's hosted, which means it can go down
2) It is closed source
3) The feature set is quite small (compared to GitLab)

GitLab has a regular release cycle, once a month, which always comes with new features.

I personally think it is a no brainer.

ThePhysicist 4 days ago 0 replies      
It would be interesting to know how many users and repositories a typical organization has on Github.

To me, it looks like they're just "optimizing" their pricing, as I would guess that most large organizations using Github have significantly more users than repositories, especially with the recent trend towards "mono-repositories".

That said, SaaS pricing is really hard to get right from the beginning. I run a code analysis company (https://www.quantifiedcode.com) and we thought a lot about which kind of pricing would be the best for us and our users (we decided to use per-repo pricing). In the end, your pricing needs to support your business model, so it's normal to change it especially if you have a lot of data on how your users use your product.

I wonder though if this will drive organizations to other solutions like Gitlab or Bitbucket, as those are significantly cheaper and pretty easy to set up these days (and you get the extra benefit of a self-hosted solution that can be hosted in your own, secure infrastructure)

rdancer 4 days ago 0 replies      
This is an awful pricing model.

One-size-fits-all never fits all. Getting rid of tiers is naive and misguided. Even if just for anchoring and the illusion of choice in the face of terrible choices, tiers are a necessity. Sales will suffer; customer satisfaction will suffer.

I don't care if existing private customers pay the same or less. The price points should have been retained, and customers allowed to switch to a lower tier if they wished. Capturing consumer surplus leads to increased revenue. GitHub needs that money; the more money they throw away foolishly, the closer they are to bankruptcy.

"Starting today"?! At least current developer plans have been grandfathered in, with a 12-month notice period. Still, if an org has been in the process of planning a move to Github, they will have to re-evaluate.

GitHub has been such a great platform. After a major stumble like this, I'm worried they may not be with us for much longer.

patcon 4 days ago 0 replies      
This is absolutely fucking atrocious news for any company who wants to run an agile operation.

I always framed "GitHub vs Bitbucket" as "agile vs enterprise" mentalities -- Bitbucket made you think hard about adding new people, and err on the side of limiting access -- i.e. conceal by default. That's perfect for enterprise, but the worst fucking incentive ever for an org that wants to make as many projects as possible accessible to all company members. GitHub (in times past) removed this cognitive burden of thinking "does this person /really/ need access...?" -- i.e. transparent by default.

But now they've fucked up.

I was always in favour of avoiding self-hosting when there was a great hosted service like GitHub available. But I would now never advise any company that I cared about to use GitHub. It will contort and twist the openness you wish to imbue in your growing company.

kuon 4 days ago 1 reply      
Now I have to pay for external collaborators? Are you kidding me? We are a small team of 5, but we make software for others. I'll have to move away from GitHub with the new pricing; we have nearly ten people per repository who might just be execs who never accessed the repo but must have access to it.
tyingq 4 days ago 0 replies      
If you happen to be a group that will be affected negatively by this move because you have a need for read-only users...

Gogs has mirror functionality where you could self-host access for those users in a fairly painless way. Screenshot of import screen: http://i.imgur.com/J4vWCIB.png

More on gogs here: https://github.com/gogits/gogs

(no association with gogs, just thought it might be helpful)

pilif 4 days ago 1 reply      
The linked page is telling us that eventually, only the new plans will be available. For my case (15 users in the organization, using the bronze plan with a lot of not-so-important repos on our own server), this will be a price increase from $300/y to $1380/y - nearly 5x more expensive.

I really hope the old plans stay around as long as possible.

Also, consider external collaborators that are part of multiple organizations: Github will now receive the $9/month per external collaborator and organization they are in. That's one hell of a deal for github.

lox 4 days ago 4 replies      
Pretty angry that Github have made this change with no mechanism for adding machine users without paying a per month charge. It seems like a key feature, which is currently horribly painful to manage and now expensive.

How does everyone else create credentials that CI can use to checkout code?
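One common workaround (an assumed setup, not something from this announcement) is a per-repository read-only deploy key, which GitHub does not bill as a user. The key name, org, and repo below are hypothetical:

```shell
# Generate a passphrase-less SSH key dedicated to CI checkouts.
ssh-keygen -t ed25519 -N "" -f ci_deploy_key -C "ci-checkout"

# Add ci_deploy_key.pub under the repository's Settings -> Deploy keys
# (leave "Allow write access" unchecked for read-only checkouts), then
# have the CI job clone with that key, e.g.:
#   GIT_SSH_COMMAND="ssh -i ci_deploy_key -o IdentitiesOnly=yes" \
#     git clone git@github.com:example-org/example-repo.git
```

The limitation is that a deploy key is tied to a single repository, so multi-repo builds still push people toward a billed machine user.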

caseymarquis 4 days ago 1 reply      
The number of very small teams and individuals this encourages to start using GitHub probably means that every organization that can't afford this could leave and GitHub would still increase the money they're making. It seems like a good move based on my imagined profile of their user base. 1 million teens and young 20-somethings just decided they'll give $7 a month to GitHub.

For bigger organizations, this is practically no money compared to other software they're using. So they'll just take the hit.

Sounds like the only customers being lost were those using github for no-commit users. Is that really a huge segment? If so they just need a special account status to fix this.

I think the question is why this took so long.

stephenr 4 days ago 0 replies      
Hopefully this opens the eyes of at least some people into realising that GitHub !== git, and GitHub !== dvcs (similarly, git !== dvcs). There are several alternatives out there, almost all of which provide more options at lower cost than GitHub.

I know, I know "everyone is familiar with github". If your developers can't function without GitHub specifically, you have a bigger problem than the new GitHub pricing.

AndrewGaspar 4 days ago 0 replies      
I'm glad. Occasionally I would delete abandoned projects to make space and now they can live forever to remind me of my failure!
BradRuderman 4 days ago 2 replies      
It's unfortunate that this doesn't promote getting business users to look at the code. In our organization 3 or 4 users are read-only and really just go in at times to check specific errors, or the logic of certain SQL queries; they don't really contribute. We will now have to pay $9 per month for these "read only" users.
Ghostium 4 days ago 1 reply      
Hmm, I still will use Gitlab instead of Github. Unlimited public and private repos for free is nice.
StevePerkins 4 days ago 1 reply      
TL;DR - GitHub is switching to Bitbucket's pricing model, but with a monthly charge of $9/user rather than $1/user.

Seems bizarre to me. The "enterprise" market they're chasing are largely Atlassian customers already, and Bitbucket has a competitive edge there with its JIRA integration. GitHub's distinguishing characteristic was a different pricing model, that for some organizations makes more sense than Atlassian's does.

If they start competing apples-to-apples, but at 9x the cost, why would any enterprise use GitHub unless they have a hipster CIO/CTO who just thinks it's a "cooler" brand?

n9com 4 days ago 1 reply      
This change worked out well for us. Gone from paying $200/month to just $25/month for our 5 person organisation.
ACow_Adonis 4 days ago 1 reply      
As a solo developer who currently pays for access annually, I feel obligated to feed back that this is pretty good news for me. Go GitHub.

The 5-private-repository limit was a bit grating and was making me consider a move elsewhere. I was going to have to consider changing how I stored/structured my projects in order to stay under what seemed to me to be a relatively arbitrary limit, which interfered with some of my automated tools and how I'd set them up to assume a separate repository for each project.

I realise there are a number of bigger organisations for whom this realistically means a price hike, and I'm winning relative to their losing, but as someone who wants to keep advantages for the little guys (the genuinely little guys, not a bunch of 50-100 guys bankrolled by several SV millionaires/billionaires)... well, I feel it's my duty to weigh in with positive feedback against what is probably going to be some negativity from the bigger guys...

nateguchi 4 days ago 3 replies      
I'm sure a lot of people will be moving from Bitbucket to this, Bitbucket's plans were great for hundreds of repos, but Github's ecosystem is definitely preferable.
ismyrnow 4 days ago 0 replies      
Github is... adopting the old Visual Studio logo?


Cozumel 4 days ago 1 reply      
'unlimited private repos' if you pay. BitBucket gives you them free and always has!
jamies888888 4 days ago 0 replies      
Very cleverly worded to sound like a price reduction when it's actually a price increase.
xchaotic 4 days ago 1 reply      
So what makes them think that they can get away with it? There are already decent competitors - GitLab, Bitbucket, Azure - or you can just host your own git repos; GitLab will even give you a nice Web UI for it. Why do they think that people will stick with GitHub? If we're talking $thousands/year, then surely migrating to another git repo provider is worth it?
mattyohe 4 days ago 1 reply      
All I ask is that Github implement Slack's Fair Billing Policy. Managing who at the organization can access a service is a silly task.

Unfortunately, it doesn't appear that this follows that model. They're open to feedback: https://github.com/contact

voltagex_ 4 days ago 1 reply      
Is there a way to get billed annually for a personal account? Makes budgeting easier and also protects me against AUD/USD changes.
red_admiral 4 days ago 2 replies      
For small private projects, gitlab.com has had unlimited private repos for $0/month for a while now.
partycoder 4 days ago 4 replies      
I am strongly considering moving to gitlab.
drinchev 4 days ago 0 replies      
Wow. Companies definitely suffer. For me ( freelancing dev, working primarily with startups ) it's a huge win.

GitHub vs BitBucket was always about :

1) 3rd-party integrations (e.g. CircleCI) - sadly Bitbucket is behind on that.

2) Issue management. Bitbucket's default behavior doesn't support labels or any other way of managing the issues structure.

Now, honestly, CircleCI + GitHub for $7 is just extremely cheap (talking solo devs / small teams).

nikolay 4 days ago 1 reply      
This is way too expensive! Self-hosted GitLab is cheaper and has better uptime!

Not to mention, they should have made you pay only for users with commit rights!

andrewljohnson 4 days ago 0 replies      
How many startups have a non-core-dev-advisor who they will now pay $9/month to get occasional comments from? Or not?

One downside of this change is if you have a private Github org, you are now incentivized not to add advisors/randoms to your org/repos. I wonder how much scurrying Github sees to remove errant users from orgs.

discodave 4 days ago 1 reply      
For comparison, quoting from the AWS CodeCommit pricing page...

AWS CodeCommit costs:

$1 per active user per month. For every active user, your account receives for that month:

10 GB-month of storage

2,000 Git requests

And the 1 year free tier is:

5 active users
50 GB-month of storage
10,000 Git requests

imron 4 days ago 1 reply      
The main image on that page looks remarkably similar to the 2010 Visual Studio logo:


konole 4 days ago 0 replies      
alanfranzoni 4 days ago 3 replies      
Do outside collaborators count as paid users?
imcotton 4 days ago 0 replies      
GitLab gets 1 point without doing anything, oddly.
throwaway2016a 4 days ago 0 replies      
This change actually saved me a lot of money per month. We use micro-service architecture and furthermore do consulting work so we had a Platinum level plan with only 7 people with access. This greatly improved our billing situation.

Although I can very much see how it could go the other way.

sandGorgon 4 days ago 0 replies      
Thank you - this is very exciting. Bitbucket uses per-user pricing and it has been extremely useful for us. People forget how useful it has been to not worry about the number of scratch repositories we can create as we experiment. Our main repo is a single monolithic repo. But don't you ask your consultants/outside resources to work on a company repo? How do you price that, I wonder.

I am not sure why people would like stuff to be priced per repo. It is a fairly unintuitive model for me and is a huge problem when you need to go and explain to the finance team that you need to spend more because you "created more repos"... say wut? Paying per user is a very clean way of pricing.

timvdalen 4 days ago 0 replies      
While unlimited private repositories sounds good, this change means that our GitHub costs are now 2.3x higher.

If this is going to be enforced, we'll need to decide between cutting away users from the org or moving to a different platform.

shrugger 4 days ago 0 replies      
But why should I use Github over Gitlab? I don't care about popularity, Gitlab already offers the minimal set of features I care about, and has demonstrated a neutral business model.

GitHub had leaks coming out about how 'white men' aren't suitable to solve GH's business problems; why should I want to associate with an organization that discriminates against people based on the color of their skin rather than by the contents of their code?

I'm glad that they are offering this, I think their customers will put this offering to good use, but it doesn't convince me.

tanepiper 4 days ago 1 reply      
There is an element of "double dipping" here that I see as a problem.

I already pay $7 a month for my own personal Github account, and for me personally it's nice to have no limit.

But if we switch to the new model at work, then not only am I paying my $7, but my company will have to pay an additional $9/month for me to have access to the repos I use daily for work.

Even if they removed me from the organisation and added me as a collaborator this will be an additional cost.

They can spin it how they like but I suspect for a large number of organisations they are going to see quite an increase in cost from using Github.

hartror 4 days ago 0 replies      
Finally! While, as others point out, this can work out more expensive, the improvement is that it scales as my company scales and doesn't act as a disincentive to developers spinning up new repos.
spriggan3 4 days ago 1 reply      
The pricing is clearly designed to make more revenue from businesses with a lot of users, which makes sense for GitHub but not for big teams in mid-sized shops, who will be paying a lot more.
oneeyedpigeon 4 days ago 0 replies      
Misleading headline alert; it should say "for paid accounts". I got way too excited there :-(
petetnt 4 days ago 0 replies      
Links not up yet, but you can already switch your plan https://github.com/organizations/your_org/settings/billing/p... for unlimited private repositories at $25/month for your first 5 users. $9/month for each additional user. (Edit: Up now! Personal plans get upgraded to unlimited too!)
CiPHPerCoder 4 days ago 0 replies      
This change was beneficial to me.

Before upgrading, a grand total of 4 users had access to our private repositories, and we were only using 7 out of our 10 repos. I was nervous about running out of repositories more so than about the cost of adding people.

(If we grow our team, it's because we have a lot of client work that's outside my immediate strong suits and we had to hire. If we do that twice, I'll gladly pay the extra $9/month.)

samstave 4 days ago 0 replies      
Well, I will say that this is a good thing, because when we had paid for ~20 repos at my last company and eng made a new repo - number 21 - it was made public by default, as we were out of private repos. FUCK THAT.

He made a mistake and checked in (yes this is on him) an AWS access and secret.

Within hours we have 1,500 machines launched in every region doing bitcoin mining....

Making a repo "public by default" is pure BS.

mark_l_watson 4 days ago 0 replies      
This will help organizations that keep huge monolithic repos on GH - one of my customers does that. They have one repo that should be dozens of smaller repos.

I use GH for my open source projects and code examples for my books and I use Bitbucket (which is also a great service) for my private repos. I have always felt somewhat guilty with this setup, working both companies for free services.

dblock 4 days ago 0 replies      
If someone is unsure about this math: our (we're https://github.com/artsy) bill goes up from $450 to $1,051 per month.

But it's not about just the money, it's about incentives.

- We have large amounts of open-source code, so we were encouraged to open-source more to avoid jumping to the next tier.

- We're probably going to close off access to a bunch of code for a big chunk of our organization. We have hundreds of humans. Whereas before we would give them permission to view as a default and hope they look at our code one day, or at least know that they can, or sometimes get a link to look at a change from a discussion, we'll now have to see whether it's worth $9 x hundreds of people every month.

I am not complaining; GitHub provides excellent service. It seems worth it at $5K a year, and probably at $10K a year too. I wish it didn't just double, though, and was more gradual.

danpalmer 4 days ago 0 replies      
I think the new pricing structure makes a lot of sense, however is awkwardly limiting in some respects that GitHub might not have considered.

We have essentially 2 classes of GitHub user in our organisation - developers and non-developers. While our devs use GitHub all the time (and are therefore worth the $25 a month for the development team), our other users might edit a few specific config files or jobs pages (for example) once a month - paying $9/month each seems quite overpriced.

We want to be an open company, one that doesn't keep secrets from employees, one that doesn't create unnecessary barriers to productivity, or have unnecessary process, so giving GitHub access to everyone in the company who wants it is important to us - this change stops us from reasonably doing that. As a result, we likely won't be switching to the new pricing structure for as long as possible, which is a shame, because it would be nice to not have to think about private repos.

piyush_soni 4 days ago 0 replies      
Still, not even a couple of free private repositories?
erikrothoff 4 days ago 0 replies      
This is awesome! I'm currently paying 50 USD per month for more repos on my private account. Definitely the right way to go.
shepbook 4 days ago 0 replies      
I think this is a clear win for individual users that have been paying for GitHub. For organizations, I'm curious how many they have just bumped above the $300-500/month mark. A lot of companies give managers discretionary spending limits they can spend without requesting approval, and it makes me wonder if they just made a bunch of managers need to start asking for approval for their GitHub bill. Another comment mentioned that having it filter up that the cost of a service just increased several times will likely result in people being told to investigate alternatives. If that's the case, there are a fair number of alternatives to go to, depending on your specific situation.
joeblau 4 days ago 0 replies      
I just had a discussion with my buddy about repositories yesterday. He wanted access to some code I had for uploading CSV files to iCloud and it's hosted on a private repo on GitHub. He was saying "I still use BitBucket's private repos"; My response was that the GitHub community is a lot stronger. Outside of community, it was hard for me to convince him that GitHub is worth it.

I've been using GitHub for a few years now to host private and public repos and paid private repos was always a point of contention. Now that they are unlimited, I can say that GitHub is definitely going to be the home to all of my future projects. I really feel like GitHub has been kicking it up a notch in 2016. Awesome work team and thanks!

Rapzid 4 days ago 0 replies      
All those private repos and no way to organize them :( Where are the namespaces/projects github?
benguild 4 days ago 0 replies      
This is nice. Now I won't have to keep deleting private repos to make room for new ones.
derrekl 4 days ago 0 replies      
There is one case we have where the new per-seat pricing doesn't fit how we're using GitHub. One of our repositories is "docs", with a bunch of markdown files, pdfs, images, and other documents related to our tech. It's mostly used in a read-only way by a bunch of non-developers, while engineers contribute heavily to the documentation. Paying $9/month per business person to be able to view the documentation is too much and will force that use case off to Confluence or some other wiki/documentation tool.
jrgifford 4 days ago 0 replies      
Is there a definition of "a few collaborators" anywhere? How many people, and is it per repo or per paid account? Really need more information before I decide if GitHub continues to get my $7/month or not.
Twisell 4 days ago 0 replies      
The main drawback of a lot of cloud-based business models I have seen is that nobody thinks it's fine to pay for leechers.

The per-user pricing is pretty reasonable, but only when you think of seeders (publishers/editors/pushers, call them what you like).

For instance, I would love to subscribe to a BI cloud suite that really fits my needs, but I'm basically the sole query editor, and I have potentially 200 private readers plus some public OpenData. I simply can't go to my boss and ask that we subscribe to this service on a 200-user basis when only one user will really have use of the license...

Revisor 3 days ago 0 replies      
Why don't more people use Assembla? We've used it for years and it has so many more features than Github. Tickets, milestones, time tracking, standup, wiki, unlimited repos with protected-branch merge rights, file sharing, discussions...

There is no free plan but the pricing is fair in my opinion: https://www.assembla.com/plans

Illniyar 4 days ago 1 reply      
Can two individual priced accounts collaborate on the same private repository?
donatj 4 days ago 0 replies      
We have a large number of people in our organization who have GitHub access who do not code and instead file or manage tickets. $9 a month just to be able to file a ticket is rather steep.
manigandham 4 days ago 0 replies      
It's amazing how cheap people/companies are if they're complaining about these prices.

$9/user/month for one of the best and easiest-to-use platforms to store and manage your repos and help your software development, which for most companies is extremely important to their product.

Slack is $8/user/month and yet people have no problem with that pricing. Git is also extremely portable and easy to move and takes minutes to self-host so what's the problem here?

thomascarney 3 days ago 0 replies      
In the spirit of offering alternatives, we created a quick price calculator to show you whether you'd be better off moving from GitHub to Planio: https://plan.io/github-alternative/

But don't hate us too much, GitHub. We still love you :)

willcodeforfoo 4 days ago 0 replies      
This is awesome! I have wanted a different pricing structure for personal accounts for a long time.

And for those who have issues with the organizational changes, did you see?

> I am an existing organization customer and prefer the per-repository plans. Can I remain on my current plan?

> Yes, you can choose to continue paying based on the number of repositories you use. You can also upgrade or downgrade in the legacy repository structure based on the number of repositories you need.

gommm 4 days ago 0 replies      
That makes a lot more sense in terms of pricing, and if it had existed earlier, I'd probably not have bothered hosting my own GitLab instance. I like to have a lot of little repos, even keeping some of my private experiments, so the limit on repositories never really made sense to me.

It might make sense however to not count collaborators with read-only access.

Of course, now that I have gitlab, there's very little reason for me to come back.

chj 4 days ago 0 replies      
Self-hosted GitLab: for about $10/month you get unlimited repos and unlimited users.
keithnz 4 days ago 0 replies      
I like GitHub, I have my open source stuff on GitHub, but when it comes to private repos, Bitbucket just seems better on pricing and is in some ways just a nicer, cleaner interface: https://bitbucket.org/product/pricing
BinaryIdiot 4 days ago 0 replies      
Looks like with our company the price goes from $25 a month to $133 if we move (or are forced to move) over to the cost-per-user model.

GitLab was already looking good; if we're forced to change, we'll likely move to GitLab. GitHub's pricing was already overly expensive for what you get, in my opinion.

danvoell 4 days ago 0 replies      
I hope this doesn't lead to less open-source software, since it will now be easier to keep your code private.
andreamazz 4 days ago 0 replies      
As much as I would love to switch to GitHub for our private repos, it still is way more expensive than BitBucket.
wickedlogic 4 days ago 1 reply      
Comments here are mostly from a single-org view, due to the many-X price increase for those orgs (large teams only)... but if I work on repos across n_orgs organizations, I'm now worth $9-25 * n_orgs to GitHub. That is a big shift from a model where I had no direct value to them as a unit.
arc_of_descent 4 days ago 0 replies      
So I have a normal user account at $7/month. Great, I now have unlimited private repos.

I also had an organization account (only 1 user) at $9/month. I switched to the $25/month plan, so yes, it's now costing me more.

I understand math. Why not just give me $5/user? :)

meetbryce 4 days ago 0 replies      
Seems like a good move, it's unclear to me what the difference is between Personal & Organization.
Aissen 4 days ago 2 replies      
I know a lot of companies that are too cheap to pay for hosting (or even host in-house), and therefore use bitbucket with its unlimited private repos. It's their gateway drug, and once they get used to that, good luck having them move over to github.
mikey_p 4 days ago 0 replies      
I think this is great. I'm part of a small 2-person consultancy (my wife and I) and we've been abusing a user account for our business for some time on the 'medium' plan, since it would give us 20 private repos, although over the last 6 years we've had to cycle stuff to backups and rotate it around in order to keep older client work in there.

It's been hard to justify upgrading to an organization for a while, since our work is hit or miss and we both have other jobs from time to time. We aren't much in terms of load on GitHub, but we'd like to be able to store 40-50 private repos or add more without worrying about our limit. The new organization pricing makes tons of sense for us, since it's very close to the old 'medium' plan we were using, instead of being 2.5 times as much, which we never felt we could justify.

jdudek 4 days ago 0 replies      
Yay, no more using single repo with orphan branches to save on number of repositories :-)
tedmiston 4 days ago 0 replies      
> Over the next few days, we will automatically move all paid accounts, from Micro to Large, to the new plan. If you're currently paying for one of those larger plans, look out for a prorated credit on your account.

Bravo, GitHub.

emodendroket 4 days ago 0 replies      
So basically they're going to start using the same model as BitBucket?
napperjabber 4 days ago 0 replies      
Pretty sure this won't end well for GitHub. They seem to be making a lot of moves like this recently. It's only a matter of time until a mass migration begins, IMO.
aavotins 4 days ago 0 replies      
Christmas is early this year.
ausjke 4 days ago 0 replies      
Bitbucket still sounds like a better deal as far as money goes, though GitHub somehow catches all the eyeballs. Bitbucket has been providing a similar service for less for a long time.
kaffeinecoma 4 days ago 0 replies      
I'm really looking forward to no longer having to figure out which project I have to axe to keep my "small" plan under the 10 repo maximum. That was always annoying.
benbenolson 4 days ago 0 replies      
This is just another reason to move to something like Gitlab or just self-host your Git repos. It takes literally seconds to set up your own Git server, so why not?
wtbob 4 days ago 0 replies      
Definitely cool, but I honestly think that if your organisation needs more than a handful of repositories then it's very likely doing something wrong.
z3t4 4 days ago 1 reply      
What's the difference between a Git server and, say, an HTTP server? To my understanding, GitHub is unable to scale Git, so they have to price accordingly.
NicoJuicy 4 days ago 0 replies      
I really don't understand why GitHub sets their prices higher while GitLab (mostly) is gaining more and more traction...
gshulegaard 4 days ago 0 replies      
So...GitLab looks better and better every day.
alexchamberlain 4 days ago 0 replies      
This is great for private accounts; it encourages better practice of smaller repos.
mikeflynn 4 days ago 0 replies      
Sounds like a lot of companies are going to end up with multiple GitHub orgs.
edpichler 4 days ago 0 replies      
To me this is a good change, I have lots of private repositories and a small team.
kazinator 4 days ago 0 replies      
For less than these price plans, you can have your own domain and server.
geostyx 4 days ago 0 replies      
I think I'll stick with my own private Gogs instance anyway.
jonmaim 4 days ago 0 replies      
Sorry it's too late, I already migrated to bitbucket 6 months ago.
cloudjacker 4 days ago 0 replies      
bitbucket: still unlimited free private repositories
gohrt 4 days ago 0 replies      
What's the delta from the old model?
bfrog 4 days ago 0 replies      
github, soon to be the next sourceforge
softinio 4 days ago 0 replies      
This is fantastic news in my opinion.
sqldba 4 days ago 0 replies      
I love the clickbait. It's missing 3 words - "for paid users". Everyone has clicked it to be disappointed. GitLab++.
ArtDev 4 days ago 0 replies      
I will stick with GitLab.
jtchang 4 days ago 2 replies      
Yay! No more bitbucket for all my private repos. I wonder if this change is because of competition?
mnml_ 4 days ago 0 replies      
too expensive
jiang101 4 days ago 0 replies      
I'm a member of a Github organisation with 63 members and 20 private repositories. As far as I can see, this changes our yearly cost from $600 to $6564.
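For anyone checking that arithmetic, here is a quick sketch. The tier numbers are taken from figures quoted elsewhere in this thread (legacy 20-repo plan at $50/month, i.e. $600/year; per-seat pricing at $25/month for the first five users plus $9/month per additional user), not from GitHub's published price list:

```python
def per_seat_monthly(users):
    # $25/month covers the first five users; $9/month each additional user
    return 25 + max(0, users - 5) * 9

print(per_seat_monthly(63) * 12)  # 6564 -- vs. $600/year on the legacy plan
```

The same formula reproduces the $70/month figure for 10 users quoted in the comparison above.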
samir16 4 days ago 0 replies      
It's awesome
cwmma 4 days ago 0 replies      
and bitbucket's sole reason for existing has gone away
Zypho 4 days ago 1 reply      
Everyone who is crying right now would be crying more if GitHub made private repos free, because then the number of open source libraries they use would be cut in half.
alchemical 4 days ago 2 replies      
Honestly when I read the title I thought GH switched their business model and offered free users the ability to start a private repo, but this is not the case.

If it is the case that I have to pay to have privacy on GitHub, then it imposes a privacy-rich versus privacy-poor dichotomy which I am uncomfortable with. Now, I know that as far as these things go (GH can be subject to National Security Letters), GH is not really absolutely private. (Backdoors into people's 'secret' Gists, anyone?)

GH had an opportunity here to change their business model so that free users can avail of private repos, and GH could still manage to bring in revenue. GH primarily makes the bulk of their income from what I call 'stakeholder accounts'. That is; those companies who simply couldn't function correctly if GH didn't exist. It is in these stakeholders that there is a symbiotic relationship of revenue for GH, and value for the stakeholder(s).

There are very few lone private individuals who have that kind of symbiotic relationship, so at least give these low-income users the same equal rights of privacy as behemoth tech organizations. It makes sense.

In terms of how GH gets revenue from these users, there are countless other ways to do this instead of relying on the monolithic device of a premium subscription model. Offer paid licenses for their proprietary GH clients. (A one off payment of $20.00 for the GH Windows client is something I would actually pay money for)...

Announcing SyntaxNet: The Worlds Most Accurate Natural Language Parser googleresearch.blogspot.com
1075 points by cjdulberger  3 days ago   231 comments top 36
xigency 3 days ago 5 replies      
Evidence that this is the most accurate parser is here; the previous approach mentioned is a March 2016 paper, "Globally Normalized Transition-Based Neural Networks": http://arxiv.org/abs/1603.06042

"On a standard benchmark consisting of randomly drawn English newswire sentences (the 20 year old Penn Treebank), Parsey McParseface recovers individual dependencies between words with over 94% accuracy, beating our own previous state-of-the-art results, which were already better than any previous approach."

From the original paper: "Our model achieves state-of-the-art accuracy on all of these tasks, matching or outperforming LSTMs while being significantly faster. In particular for dependency parsing on the Wall Street Journal we achieve the best-ever published unlabeled attachment score of 94.41%."

This seems like a narrower standard than described, specifically being better at parsing the Penn Treebank than the best natural language parser for English on the Wall Street Journal.

The statistics listed on the project GitHub actually contradict these claims by showing the original March 2016 implementation has higher accuracy than Parsey McParseface.

teraflop 3 days ago 7 replies      
This is really cool, and props to Google for making it publicly available.

The blog post says this can be used as a building block for natural language understanding applications. Does anyone have examples of how that might work? Parse trees are cool to look at, but what can I do with them?

For instance, let's say I'm interested in doing text classification. I can imagine that the parse tree would convey more semantic information than just a bag of words. Should I be turning the edges and vertices of the tree into feature vectors somehow? I can think of a few half-baked ideas off the top of my head, but I'm sure other people have already spent a lot of time thinking about this, and I'm wondering if there are any "best practices".
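One common baseline along those lines: treat each dependency edge as an atomic token and hash it into a fixed-size count vector, alongside the usual bag of words. A minimal sketch, where the edge triples are hand-written stand-ins for real parser output rather than actual SyntaxNet calls:

```python
import hashlib

def edge_features(edges, dim=1024):
    """Hash (head, relation, child) dependency triples into a count vector."""
    vec = [0] * dim
    for head, rel, child in edges:
        key = "|".join((head, rel, child)).encode("utf-8")
        idx = int(hashlib.md5(key).hexdigest(), 16) % dim  # stable bucket
        vec[idx] += 1
    return vec

# Hand-written edges for "The cat sat on the rug"
edges = [("sat", "nsubj", "cat"), ("sat", "prep", "on"), ("on", "pobj", "rug")]
vec = edge_features(edges)
print(sum(vec))  # 3 -- one count per edge
```

The resulting vector can be concatenated with word-count features and fed to any off-the-shelf classifier; the hashing trick just keeps the dimensionality fixed without maintaining an explicit feature vocabulary.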

fpgaminer 3 days ago 6 replies      
One of the projects I'd love to develop is an automated peer editor for student essays. My wife is an English teacher and a large percentage of her time is taken up by grading papers. A large percentage of that time is then spent marking up grammar and spelling. What I envision is a website that handles that grammar/spelling bit. More importantly, I'd like it as a tool that the students use freely prior to submitting their essays to the teacher. I want them to have immediate feedback on how to improve the grammar in their essays, so they can iterate and learn. By the time the essays reach the teacher, the teacher should only have to grade for content, composition, style, plagiarism, citations, etc. Hopefully this also helps to reduce the amount of grammar that needs to be taught in-class, freeing time for more meaningful discussions.

The problem is that while I have knowledge and experience in the computer vision side of machine learning, I lack experience in NLP. And to the best of my knowledge NLP as a field has not come as far as vision, to the extent that such an automated editor would have too many mistakes. To be student facing it would need to be really accurate. On top of that it wouldn't be dealing with well formed input. The input by definition is adversarial. So unlike SyntaxNet which is built to deal with comprehensible sentences, this tool would need to deal with incomprehensible sentences. According to the link, SyntaxNet only gets 90% accuracy on random sentences from the web.

That said, I might give SyntaxNet a try. The idea would be to use SyntaxNet to extract meaning from a broken sentence, and then work backwards from the meaning to identify how the sentence can be modified to better match that meaning.

Thank you Google for contributing this tool to the community at large.

jrgoj 3 days ago 1 reply      
Now for the buffalo test[1]

`echo 'Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo' | syntaxnet/demo.sh`

 buffalo NN ROOT
 +-- buffalo NN nn
 |   +-- Buffalo NNP nn
 |   |   +-- Buffalo NNP nn
 |   |   +-- buffalo NNP nn
 |   +-- buffalo NN nn
 +-- Buffalo NNP nn
 +-- buffalo NNP nn
[1]: https://en.wikipedia.org/wiki/Buffalo_buffalo_Buffalo_buffal...

deanclatworthy 3 days ago 1 reply      
It's really nice to have access to these kinds of tools. I am sure some folks from Google are checking this, so thank you.

Analysis of the structure of a piece of text is the first step to understanding its meaning. IBM are doing some good work in this area: http://www.alchemyapi.com/products/demo/alchemylanguage

Anything in the pipeline for this project to help with classifying sentiment, emotion etc. from text?

feral 3 days ago 5 replies      
I'd love to hear Chomsky's reaction to this stuff (or someone in his camp on the Chomsky vs. Norvig debate [0]).

My understanding is that Chomsky was against statistical approaches to AI, as being scientifically un-useful - eventual dead ends, which would reach a certain accuracy, and plateau - as opposed to the purer logic/grammar approaches, which reductionistically/generatively decompose things into constituent parts, in some interpretable way, which is hence more scientifically valuable, and composable - easier to build on.

But now we're seeing these very successful blended approaches, where you've got a grammatical search, which is reductionist, and produces an interpretable factoring of the sentence - but its guided by a massive (comparatively uninterpretable) neural net.

It's like AlphaGo - which is still doing search, in a very structured, rule based, reductionist way - but leveraging the more black-box statistical neural network to make the search actually efficient, and qualitatively more useful. Is this an emerging paradigm?

I used to have a lot of sympathy for the Chomsky argument, and thought Norvig et al. [the machine learning community] could be accused of talking up a more prosaic 'applied ML' agenda into being more scientifically worthwhile than it actually was.

But I think systems like this are evidence that gradual, incremental improvement of working statistical systems can eventually yield more powerful reductionist/logical systems overall. I'd love to hear an opposing perspective from someone in the Chomsky camp, in the context of systems like this. (Which I am hopefully not strawmanning here.)

[0]Norvig's article: http://norvig.com/chomsky.html

mdip 3 days ago 1 reply      
This looks fantastic. I've been fascinated with parsers ever since I got into programming in my teens (almost always centered around programming language parsing).

Curious - The parsing work I've done with programming languages was never done via machine learning, just the usual strict classification rules (which are used to parse ... code written to a strict specification). I'm guessing source code could be fed as data to an engine like this as a training model but I'm not sure what the value would be. Does anyone more experienced/smarter than me have any insights on something like that?

As a side-point:

Parsey McParseface - Well done. They managed to lob a gag over at NERC (Boaty McBoatface) and let them know that the world won't end because a product has a goofy name. Every time Google does things like this, they send an unconscious reminder that they're a company that's 'still just a bunch of people like our users'. They've always been good at marketing in a way that keeps that "touchy-feely" sense about them, and they've taken a free opportunity to get attention for this product beyond just the small circle of programmers.

As NERC found out, a lot of people paid attention when the winning name was Boaty McBoatface (among other, more obnoxious/less tasteful choices). A story about a new ship isn't normally going to hit the front page of any general news site, and I always felt that NERC missed a prime opportunity to continue with that publicity and attention. It became a topic talked about by friends of mine who would otherwise have never paid attention to anything science-related. It would have been comical, should Boaty's mission turn up a major discovery, to hear 'serious newscasters' say the name of the ship in reference to the breakthrough. And it would have been refreshing to see that organization stick to the original name with a "Well, we tried, you spoke, it was a mistake to trust the pranksters on the web, but we're not going to invoke the 'we get the final say' clause because that wasn't the spirit of the campaign. Our bad."

Someone 3 days ago 0 replies      
For those wondering: the license appears to be Apache 2.0 (https://github.com/tensorflow/models)
syncro 2 days ago 0 replies      
A Dockerized version, so you can try it without installing:


zodiac 17 hours ago 0 replies      
> It is not uncommon for moderate length sentences - say 20 or 30 words in length - to have hundreds, thousands, or even tens of thousands of possible syntactic structures.

Does "possible" mean "syntactically valid" here? If so I'd be interested in a citation for it.

Also, I wonder what kind of errors it makes with respect to the classification in http://nlp.cs.berkeley.edu/pubs/Kummerfeld-Hall-Curran-Klein...

TeMPOraL 3 days ago 4 replies      
> Humans do a remarkable job of dealing with ambiguity, almost to the point where the problem is unnoticeable; the challenge is for computers to do the same. Multiple ambiguities such as these in longer sentences conspire to give a combinatorial explosion in the number of possible structures for a sentence.

Isn't the core observation about natural language that humans don't parse it at all? Grammar is a secondary, derived construct that we use to give language some stability; I doubt anyone reading "Alice drove down the street in her car" actually parsed the grammatical structure of that sentence, either explicitly or implicitly.

Anyway, some impressive results here.

ohitsdom 3 days ago 2 replies      
I'm sure it's only a matter of time before someone puts this online in a format easily played with. Looking forward to that
rspeer 3 days ago 0 replies      
I'm glad they point out that we need to move on from Penn Treebank when measuring the performance of NLP tools. Most communication doesn't sound like the Penn Treebank, and the decisions that annotators made when labeling Penn Treebank shouldn't constrain us forever.

Too many people mistake "we can't make taggers that are better at tagging Penn Treebank" for "we can't make taggers better", when there are so many ways that taggers could be improved in the real world. I look forward to experimenting with Parsey McParseface.

weinzierl 3 days ago 1 reply      
Say I wanted to use this for English text with a large amount of jargon. Do I have to train my own model from scratch, or is it possible to retrain Parsey McParseface?

How expensive is it to train a model like Parsey McParseface?

scarface74 3 days ago 1 reply      
I started working on a parser as a side project that could parse simple sentences, create a knowledge graph, and then you could ask questions based on the graph. I used http://m.newsinlevels.com at level 1 to feed it news articles and then you could ask questions.

It worked pretty well but I lost interest once I realized I would have to feed it tons of words. So could I use this to do something similar?

What programming language would I need to use?

jventura 3 days ago 2 replies      
As someone who has published work in the NLP area, I always take claimed results with a grain of salt. With that said, I still will have to read the paper to know the implementation details, although my problem with generic linguistic approaches such as this one seems to be that they are usually hard to "port" to other languages.

For instance, the way they parse sequences of words may or may not be too specific to the English language. It is somewhat similar to what we call "overfitting" in the data-mining area, and it may invalidate this technique for other languages.

When I worked in this area (up to 2014), I worked mainly on language-independent statistical approaches. As with everything, there are trade-offs: you can extract information from more languages but, in general, with less certainty.

But in general, it is good to see that the NLP area is still alive somewhere, as I can't seem to find any NLP jobs where I live! :)

Edit: I've skimmed it, and it is based on a neural network, so in theory, if it were trained on other languages, it could return good enough results as well. It is normal for English/American authors to include only English datasets, but I would like to see an application to another language. This is a very specialized domain of knowledge, so I'm quite limited in my analysis.

neves 2 days ago 0 replies      
Shouldn't the title be renamed to "The World's Most Accurate Natural Language Parser for English"?

It's impressive how Google's natural language features, starting from the simple spell check, degrade when working with languages other than English.

the_decider 3 days ago 0 replies      
According to their paper (http://arxiv.org/pdf/1603.06042v1.pdf), the technique can also be applied to sentence compression. It would be cool if Google publishes that example code/training-data as well.
mindcrash 2 days ago 0 replies      
Anyone planning (or already busy) training Parsey with one of the alternative Treebanks available from Universal Dependencies [1]? Would love to know your results when you have any :)

I am personally looking for a somewhat reliable NLP parser which can handle Dutch at the moment. Preferably one which can handle POS tagging without hacking it in myself.

[1] http://universaldependencies.org/

joosters 3 days ago 4 replies      
I don't see how a linguistic parser can cope with all the ambiguities in human speech or writing. It's more than a problem of semantics, you also have to know things about the world in which we live in order to make sense of which syntactic structure is correct.

e.g. take a sentence like "The cat sat on the rug. It meowed." Did the cat meow, or did the rug meow? You can't determine that by semantics, you have to know that cats meow and rugs don't. So to parse language well, you need to know an awful lot about the real world. Simply training your parser on lots of text and throwing neural nets at the code isn't going to fix this problem.

aaron-santos 3 days ago 1 reply      
I'd love to see the failure modes especially relating to garden path sentences. [1]

[1] - https://en.wikipedia.org/wiki/Garden_path_sentence

hartator 3 days ago 1 reply      
> At Google, we spend a lot of time thinking about how computer systems can read and understand human language in order to process it in intelligent ways.

There are 6 links in this sentence in the original text. I get that they can help to give more context, but I think they actually make the text harder to "human" parse. It also feels like they have hired a cheap SEO consultant to do some backlink integration.

jdp23 3 days ago 4 replies      
Parsey McParseface is a great name.
sourcd 2 days ago 0 replies      
What would it take to build something like "wit.ai" using SyntaxNet ? i.e. to extract "intent" & related attributes from a sentence e.g.

Input : "How's the weather today"

Output : {"intent":"weather", "day":"Use wit-ai/duckling", "location":"..."}

amelius 3 days ago 3 replies      
How would you feed a sentence to a neural net? As I understand it, the inputs are usually just floating point numbers in a small range, so how is the mapping performed? And what if the sentence is longer than the number of input neurons? Can that even happen, and pose a problem?
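The usual answer, sketched below with a toy vocabulary (the word list and sizes are made up for illustration): map each word to an integer ID, pad or truncate the sequence to a fixed length, and let an embedding layer inside the network turn each ID into a learned vector of floats.

```python
# Toy vocabulary; real systems build this from the training corpus.
vocab = {"<pad>": 0, "<unk>": 1, "the": 2, "cat": 3, "sat": 4}

def encode(sentence, max_len=8):
    """Map words to integer IDs, then pad/truncate to a fixed length."""
    ids = [vocab.get(w, vocab["<unk>"]) for w in sentence.lower().split()]
    ids = ids[:max_len]                               # truncate long input
    ids += [vocab["<pad>"]] * (max_len - len(ids))    # pad short input
    return ids

print(encode("The cat sat"))  # [2, 3, 4, 0, 0, 0, 0, 0]
```

Longer sentences are simply truncated, or handled by recurrent models that consume one token at a time, so input length is a preprocessing choice rather than a hard limit of the network.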
w_t_payne 3 days ago 0 replies      
Cool - I reckon I'm going to try to use it to build a "linter" for natural language requirements specifications. (I'm a bit sad like that).
WWKong 3 days ago 1 reply      
Anyone know a tool that does Natural Language to SQL?
Animats 3 days ago 0 replies      
This could lead to a fun WordPress plug-in. All postings must be parsable by this parser.

Surprisingly, this thing is written in C++.

zem 3 days ago 0 replies      
One interesting use I can think of is new, improved readability scores that can take into account words that are common or uncommon depending on part of speech (e.g. a text that used "effect" as a noun would be lower-level than one that used "effect" as a verb).
instakill 2 days ago 0 replies      
What are some use cases for this for hobbyists?
bertan 2 days ago 0 replies      
Parsey McParseface <3
degenerate 3 days ago 0 replies      
I'd love to let this loose on the comments section of worldstarhiphop or liveleak and see what it comes up with...
vicaya 2 days ago 0 replies      

 1. WordNet
 2. ImageNet
 3. SyntaxNet
 ...
 n. SkyNet

jweir 3 days ago 6 replies      
scriptle 3 days ago 0 replies      
Did I just read it as Skynet ?
PaulHoule 3 days ago 1 reply      

This kind of parser isn't all that useful anyway. Parts of speech are one of those things people use to talk about language with, but you don't actually use them to understand language.

The NYPD Was Ticketing Legally Parked Cars; Open Data Put an End to It iquantny.tumblr.com
863 points by danso  4 days ago   177 comments top 31
danso 4 days ago 6 replies      
For people who skip straight to the comments: I couldn't fit it into the title, but the other half of the OP's title is that the author confronted the NYPD with his data analysis and they thanked him and spoke positively about the effect of open data:

> Mr. Wellington's analysis identified errors the department made in issuing parking summonses. It appears to be a misunderstanding by officers on patrol of a recent, abstruse change in the parking rules. We appreciate Mr. Wellington bringing this anomaly to our attention...

> Thanks to this analysis and the availability of this open data, the department is also taking steps to digitally monitor these types of summonses to ensure that they are being issued correctly.

Sure, it's easy to pay lip service to open data and transparency...but I've found for the most part that bureaucracies are generally OK with open data after it's been ingrained in their culture (which is why it is so amazingly easy to get data from Florida -- they have an IT system that is optimized to handle it -- the employees don't mind fulfilling the requests since it's no skin off their back).

Bloomberg got the ball rolling, and hopefully the momentum continues...I'd be happy with NYC agencies tolerating open data as a status quo...Looking forward to the day when bureaucrats and citizens can get to the point where transparency isn't seen as a zero-sum game.

colordrops 3 days ago 5 replies      
An anecdote: when I lived in Manhattan in 2001, I parked near 61st and 1st, between two differing parking signs. I was clearly very close to one and much further from the other, by at least 50 yards. I got a ticket for illegal parking according to the further sign. Being a broke ex-dot-commer, I was furious, so I photographed and measured everything and put together a seven-page report.

After a long while, I got a response back from the responsible agency, which agreed that I was in compliance with the parking rules, and thus they would reduce the fine from $100 to $40. What??

thevibesman 4 days ago 1 reply      
In 2011, I was given two parking tickets when legally parked in Boston.

The first ticket was issued for parking in resident street parking without a permit. I was actually parked just after the sign delimiting the resident parking area[1]. I photographed where I was parked and sent in an appeal which was accepted.

A month later, I was parked in the same non-resident street parking and I was issued a ticket for failing to pay a parking meter; the number of the meter I supposedly failed to pay was a LONG block and a half away from where I was parked. Again, I submitted an appeal, but this time it was not accepted.

The friend I was visiting when I received these tickets was a producer for a local television news station. She told the story of these two tickets to their investigative reporter who was interested in the story and was going to inquire about it with the Boston PD. Apparently she didn't get anywhere with the story, but Boston PD did stop sending me failure to pay notifications for the fine.

[1]: This non-resident street parking was not metered.

(EDIT: Foot-note formatting)

ghaff 4 days ago 2 replies      
As the article explained, tickets had been voided when cases were disputed as widely happens in just about every city.

Honestly, if I were a public official reading the comments here, I'd be inclined to just go "Why do I even bother? I explained how, even though the people who hand out most of the parking tickets were educated about the change, many police officers weren't. We'll fix it now."

Sure, rerun the analysis a year or two down the road. If things don't change, that's a story. But this strikes me as a nice use of open data and a reasonable response from the NYPD. You can argue that they should do pro-active refunds but a lot of addresses will have changed and it would be otherwise difficult and expensive. I'm honestly not convinced that's a reasonable expectation.

wheelchairuser 4 days ago 3 replies      
It seems that there are multiple failures here. 1. Designated approved cutouts should be an easier process to obtain (just try getting a stop sign/crosswalk approved in the city and you'll understand the black hole that is the DOT approval process). 2. Overeager officers issuing tickets.

Pedestrian cutouts are there for a real purpose, namely accessibility for people with disabilities and wheelchairs. While it sucks for cars to get ticketed, I for one have no sympathy for people blocking cutouts in general, because they are not offering courtesy to those who require these cutouts to live.

Edit: For those suggesting that the crosswalks in the middle of the block go "nowhere," consider the following: a wheelchair user parks his/her car and exits the car. He/she will have to travel all the way to the end of the block in order to get out of the street to safety. Even if it may seem like the crosswalk goes nowhere, it has a real functional purpose.

mynameisnoone 4 days ago 1 reply      
The NYPD ticketed Casey Neistat for not riding in the bike lane; a viral video put an end to it. ;)


bpchaps 4 days ago 0 replies      
I've been working on something like this for over a year now for Chicago. Most of that time has been spent simply doing data cleanups.

My most recent update from Monday: https://plot.ly/~red-bin/6.embed

Pretty fucking proud of that, not gonna lie.
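The grunt work described above is mostly normalization before any mapping or plotting can happen. Here is a minimal sketch of that kind of cleanup in pandas, with purely hypothetical column names and sample rows (the real Chicago export's schema will differ):

```python
import pandas as pd

# Hypothetical sample of raw ticket rows. Real open-data exports tend to
# have the same kinds of problems: inconsistent casing, stray whitespace,
# duplicate rows from repeated exports, and dates stored as strings.
raw = pd.DataFrame({
    "ticket_id": [1, 1, 2, 3],
    "address": [" 100 n state st", "100 N STATE ST",
                "55 e monroe st ", "55 E Monroe St"],
    "issued": ["2016-05-01", "2016-05-01", "2016-05-02", "2016-05-03"],
})

def clean_tickets(df):
    df = df.copy()
    # Normalize addresses so the same location compares equal.
    df["address"] = df["address"].str.strip().str.upper()
    # Parse the issue date into a real datetime for time-series work.
    df["issued"] = pd.to_datetime(df["issued"])
    # Drop duplicates that only differed in formatting before cleanup.
    return df.drop_duplicates(subset=["ticket_id", "address"]).reset_index(drop=True)

clean = clean_tickets(raw)
print(len(clean))  # the two formatting variants of ticket 1 collapse into one row
```

Only after a pass like this can tickets be reliably grouped by location and compared against the legal-parking rules for each spot.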

bradgessler 3 days ago 0 replies      
This is a good reminder that San Francisco went the other direction and blocked Fixed: http://techcrunch.com/2015/10/12/fixed-the-app-that-fixes-yo...
jasonjei 4 days ago 2 replies      
Did NYPD refund fines from erroneously paid citations?
m0llusk 3 days ago 0 replies      
It might make sense to focus on the core issue of pedestrian safety and curb cuts. As someone who rolls to most of my destinations on wheels, blocked ramps can be a major inconvenience that might conceivably compete with parking convenience as an issue.
PhasmaFelis 4 days ago 2 replies      
That's a shame. Parking in front of a handicapped-access ramp is a shitty thing to do even if it is legal.
tibbon 3 days ago 1 reply      
Dozens of motorcyclists in downtown Boston recently had a problem like this. There are 12-hour motorcycle meters (25 cents an hour, woo!), but a parking cop kept issuing tickets for "Over 2 hour parking" at those meters. The officer assumed they were like the car meters.

I challenged my stack of them, and they relented, but refused to actually write me back with how/why this was happening, and how they were upgrading their training.

What gets me is that I doubt they did a search through their system for all such tickets and then immediately issued refunds, or put through a software patch on their ticketing system to disallow those tickets at motorcycle meters.

Refunding can't be that hard.

jrockway 4 days ago 3 replies      
Why risk parking in a questionable area like a ramp cutout when some streets have an entire extra parking lane (indicated by a picture of a bicycle)?
holdenc 3 days ago 2 replies      
Sad to see that a majority of these bad tickets are in poorer neighborhoods. It seems most of Manhattan (below the Bronx) is relatively ok.
DannyBee 3 days ago 0 replies      
So, I would wait till they stop before declaring victory :) Certainly a great step forward, but here's my cynical view, after dealing with tons and tons and tons of city governments (it used to be my job):

It's very easy to issue statements saying they've solved the problem. Rarely do they actually solve the problem :)

tedmiston 3 days ago 1 reply      
How do you know that these tickets, given at locations of pedestrian ramps, were not actually due to other things, like uncommon street-sweeping hours or those pesky snow parking restrictions that are so hard to keep track of? Is the reason for the citation available in the data?
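This is checkable: open parking-violation datasets generally carry a violation code or description per summons, so the analysis can be restricted to ramp tickets before drawing conclusions. A sketch of that filtering step, using illustrative column names and values rather than the official NYC schema:

```python
import pandas as pd

# Hypothetical miniature of a parking-violations extract. The column
# names and description strings here are illustrative; a real export
# would use the dataset's own schema.
tickets = pd.DataFrame({
    "summons": [101, 102, 103, 104],
    "violation": [
        "OBSTRUCTING PEDESTRIAN RAMP",
        "STREET CLEANING",
        "OBSTRUCTING PEDESTRIAN RAMP",
        "EXPIRED METER",
    ],
})

# Keeping only ramp violations excludes tickets written for street
# sweeping, snow restrictions, meters, and so on, which answers the
# "could these be for something else?" objection directly.
ramp_only = tickets[tickets["violation"] == "OBSTRUCTING PEDESTRIAN RAMP"]
print(len(ramp_only))
```

If the violation field were missing from the data, this objection would be much harder to rule out.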
ommunist 3 days ago 0 replies      
In big Russian cities, commercial towing operators often bribe police officers to cover their activity when towing misparked cars. But resistance is futile, and there is an app to help fight this: http://peapp.ru.
fsaneq2 4 days ago 2 replies      
We need this in SF. SFMTA ticketed my car for no reason before; I tried their "appeal" process, only to get a response that sounded like they didn't even read what I said.
Rapzid 3 days ago 0 replies      
To me this is an example of how transparency through information services is winning. Eventually this will pervade every aspect of government. We are going to be in a much better place when we can see what our government is doing in fine detail: spending, enforcement, funding (think campaigns), voting history, etc. The technology is there now, but we haven't quite reached the point where it's all on the table and expected/demanded.
Qantourisc 3 days ago 0 replies      
"..., the department focused training on traffic agents, ..." -> The police got trained, but we are supposed to just know the law. How unjust.
tomohawk 3 days ago 1 reply      
It's not surprising that a small change like this could be enabled by open data, but big changes? There's been plenty of data related to Medicare fraud, Veterans Administration fraud, etc., but changing big things like that appears to be a no-go, even though the cost is tremendous in terms of affected lives and money.
yomly 4 days ago 0 replies      
After recently dealing with the various governments / nationalised businesses of a number of countries, I can safely say that I am well and truly fed up with their inane bureaucracy.

This article doesn't surprise me: where the recent swath of internet companies seem to extol individual agency/ownership, nationalised companies seem to preach the exact opposite - a problem is never the individual's responsibility, and people seem perfectly happy to ship a broken status quo. As such, ludicrous oversights like this are a regular occurrence rather than a gross, intolerable negligence.

This is not even yet mentioning how awful the UX is for the customer who has to interface with these institutions.

While it may be very easy to discard my point as rife with cliches, it's worth noting that we have become very spoilt by ruthlessly frictionless (yet equally effective) customer-facing tech products. The opportunities to streamline processes and save time and money for state-owned interfaces (both online and offline) are enormous. As an example, I recently had to queue in a hospital with my sick mother for an hour, only to be told that that individual couldn't process our particular issue and that we would have to join the back of the adjacent queue (note there was zero signposting whatsoever). This is one example and not a criticism of healthcare, but what happened to service-driven processes?

Instead we, the public, have to live with the dire reality of inefficient (and increasingly, apparently indifferent) government services, while the gap between private and state-run services continues to expand and we grow increasingly disenfranchised.

xapata 3 days ago 0 replies      
The best defense we have against bad actors in government (whether intentional or accidental) is ubiquitous public surveillance -- not the government spying on the people, but the people monitoring the government. A cameraphone in nearly every pocket is a good start.
c64sys64738 3 days ago 0 replies      
You can't park in peace in NY. They hired H-1B traffic officers too; I've heard the American traffic officers weren't aggressive enough.
known 3 days ago 0 replies      
kevin_thibedeau 4 days ago 1 reply      
The T intersection at Virginia Avenue is not legal. The ramps serve an unmarked crosswalk, which exists at all intersections in NY state.
rotrux 3 days ago 1 reply      
Headline -- "The NYPD Was Ticketing Legally Parked Cars; Open Data Put an End to It."

Expanded & Explicit Translated Headline -- "Some NYPD Officers Were Cutting Corners in a Profession Where Cutting Corners is a Relatively Big Deal. Some People With Computers Did a Fancy Search and Now Those Cops Are Rightfully Getting in Trouble and the Shit Is Stopping"

You're welcome.

caiob 3 days ago 0 replies      
Didn't ticket enough. I say, ticket all the cars just for being on the street. We need more public transit.
agjacobson 3 days ago 0 replies      
Nice. Something like this happened to me in 1979 in Berkeley. You can infer how angry I got. I was too busy, at the end of grad school, to even contest it.
guelo 4 days ago 1 reply      
I don't think that's what "systematically" means.
tn13 3 days ago 1 reply      
Whenever I visit NY I don't assume anything is legal. If someone tells me walking barefoot is illegal in NY, I would believe it. After all, they kill people for sharing smokes and all.
A cave in Romania that was sealed for 5.5M years bbc.com
882 points by ForFreedom  3 days ago   161 comments top 26
tomp 3 days ago 5 replies      
Some interesting takeaways:

- The cave is called "Movile".

- 3 species of spider, a centipede, 4 species of isopod (the group that includes woodlice), a leech never seen anywhere else in the world, and an unusual-looking insect called a waterscorpion

- Movile's only snail [probably the only snail species] suggested that it has been down there for just over 2 million years.

- Many animals are born without eyes, which would be useless in the dark. Almost all are translucent as they have lost pigment in their skin.

- The cave seems to have no contact with the surface; the Chernobyl accident released lots of radioactive metals, which found their way into the soils and lakes surrounding Movile Cave. However, a 1996 study found no traces of them inside the cave.

- The ecosystem seems to be supported by chemosynthesis; bacteria oxidise methane, sulphide and ammonia, generating energy and organic matter.

ucaetano 3 days ago 9 replies      
"in pitch darkness and temperatures of 25 C"

Is this supposed to be terrifying? Anyone who has woken up at night and walked to the bathroom has faced pitch darkness and temperatures of 25 C!

ygra 3 days ago 3 replies      
The title is a bit misleading. The cave was discovered in 1986. It's just that this article is a bit more recent.
jgrahamc 3 days ago 6 replies      
Fewer than 100 people have been allowed inside Movile, a number comparable to those who have been to the Moon.

BBC reporting is really kind of crappy. 12 people walked on the Moon; 22 people orbited the Moon.

mh-cx 3 days ago 2 replies      
Maybe I've missed it, but how exactly do they know it was cut off for 5.5M years?
carlesfe 3 days ago 0 replies      
I find this fascinating! Googling a bit, it seems there is a documentary on the Movile cave: http://www.dailymotion.com/video/x1qpffb_the-secret-underwor...
txutxu 3 days ago 1 reply      
> Strangely, the worse the air gets the more animals there

> are. It's not at all obvious why that should be, or how

> the animals survive at all.

Hmmm, I think it's the opposite: with more animals using the same static pocket of air, there will be less oxygen.

dc2 3 days ago 0 replies      
A walk-through of the cave:


It gets more interesting starting on the third video.

amasad 3 days ago 2 replies      
Goes to show how resilient life is. Which makes it all the more unlikely that life is all that rare.
dredmorbius 3 days ago 1 reply      
Both HN and BBC titles are rather clickbaity. Pity for what turns out to be an interesting article on non-photosynthesis-based food webs, communities, and metabolism.
amorphid 3 days ago 2 replies      
Are animals in a cave like this considered endangered species?
ommunist 3 days ago 0 replies      
This is fascinating. Life is everywhere. This is real "pitch black", just with smaller monsters.
csours 3 days ago 1 reply      
Perhaps the Andromeda Strain will be found in a cave, not in space...
x2398dh1 3 days ago 0 replies      
What I'm reading is that the cave is like a 5-million-year-old microbrewing process, and as with many microbrews there are some nightmarish scorpions and spiders and scary creatures associated with the cave's brand. If that analogy is correct, then by exploring that cave we have essentially pulled the cork and stuck our finger in, haven't we?
ensiferum 1 day ago 0 replies      
... and then humans came along and filled it with coke bottles, plastic bags, old car tyres and other rubbish.
CodinM 3 days ago 0 replies      
Holy shit my city is #1 on HackerNews.
sengork 3 days ago 0 replies      
Those creatures have evolved on their own tangent. Quite interesting to see how different they are compared to the creatures outside of the cave.

On another note, a well-hidden cave like that would be great for preserving man-made historical artefacts.

censhin 3 days ago 1 reply      
So in summary, life, ahh, finds a way.
udkl 2 days ago 0 replies      
This article reminded me of Terraria and the caves in it.
OJFord 3 days ago 1 reply      

 > became roughly the 29th person to enter

akashaggarwal 2 days ago 0 replies      
Minecraft dungeon discovered!
shawabawa3 3 days ago 4 replies      
> Movile's only snail suggested that it has been down there for just over 2 million years

This is a bit misleading. It should say only species of snail, not a single snail that's been alive for 2 million years

nxzero 3 days ago 5 replies      
I like comments like this, but just a heads-up that dang has banned users who repeatedly post summaries in the comments.

EDIT: Here's a recent comment from dang (the main HN mod) on the topic of summary comments: https://news.ycombinator.com/item?id=11608134

sdegutis 3 days ago 2 replies      
Right, just saw this a minute ago on reddit. It seems like HN is just /r/Futurology, /r/TodayILearned, /r/Programming, and /r/Startups rolled into one.
FloNeu 3 days ago 0 replies      
That must be where my motivation was hiding since last week...
Did I just win? twitter.com
941 points by davidtgoldblatt  2 days ago   126 comments top 23
dredmorbius 2 days ago 3 replies      
This reminds me of an old folk tale of the trickster and the rich man.

A king passing through a town finds a man about to be punished for fraud. He intercedes and asks what the matter is. The trickster says in his defence, "I ask people for things, and they give them to me." The king is incredulous but poses a challenge: "You must ask and receive money from the richest man in town." The trickster agrees, but being short on assets, requests a loan. The king obliges, and the trickster arranges (eliding details) to induce the town's richest resident to provide him with a wealth of goods. He returns to the king two days later with evidence in tow. The king is impressed by this demonstration, at which the trickster notes that he'd actually met the conditions 48 hours earlier when the king, wealthier than the town's richest resident, had offered him a loan.

There's something to those old stories.

(I'm not positive of the source but believe it's included in Idries Shah's World Tales.)

tstrimple 2 days ago 6 replies      
1. Create issues for items I need fixed on my github repos.

2. Offer a $100 bounty to people who can trick me into getting some string into my projects. The easiest way to "trick" me, of course, is to hide it inside a PR which fixes a real issue.

3. Find and remove the string before merging the PR. I've had one of my issues fixed for free. Rinse and repeat!

Bonus Round: Stage an announcement on twitter and have someone cleverly trick me into including the string on my website (which I was totally going to do anyway). Post clever trick to code geek social media and reap the sweet free viral marketing and hackers trying to earn a Benjamin.

Aelinsaar 2 days ago 4 replies      
It's not clever to hack something that you can socially engineer, and that should be hacking 101. Clever win.
daxfohl 2 days ago 2 replies      
What exactly happened here? All I see is a highlighted line that seems to have already been there.
j79 2 days ago 0 replies      
An acknowledgement of the win: https://twitter.com/DefuseSec/status/730903547747819520

The offer still stands though, if you'd like to try: https://twitter.com/DefuseSec/status/730904219419443200

nkristoffersen 2 days ago 0 replies      
It took me a second to understand what happened. But yes, he earned his $100.
joemi 2 days ago 3 replies      
Can someone link to context? Without it, I don't see why this is even posted here.
delibes 2 days ago 0 replies      
Asked a question, won a beer token. It counts.
goatherders 2 days ago 0 replies      
Are some of you actually arguing over whether or not the website qualifies as a "software project?" Goodness, maybe stop taking the world so literally/seriously.
pnathan 2 days ago 0 replies      
That is a gem of cleverness.
drudru11 1 day ago 0 replies      
"Mostly drunk ramblings of a programmer and crypto enthusiast."

Maybe we shouldn't drink and "crypto"? :-)

Jeremy1026 2 days ago 0 replies      
Troll level = 100%
satysin 2 days ago 0 replies      
Just beautiful :)
anaolykarpov 2 days ago 3 replies      
Would you pay $100 to get on the front page of HN and who knows what other popular sites?

Maybe it's just a marketing stunt

shadykiller 2 days ago 1 reply      
But wait, how did it happen?
clapinton 2 days ago 0 replies      
This just made my day.
russelluresti 2 days ago 0 replies      
slow clap.
aaroninsf 2 days ago 1 reply      
[[ obligatory reference to Betteridge's law ]]
kauegimenes 2 days ago 1 reply      
Another way to win this bounty would be to share some code containing the string BackdoorPoCTwitter in the same color as the page background. If he copied and pasted the code, it could work. ^^
dragontamer 2 days ago 4 replies      
I guess a webpage is a software project...
msoad 2 days ago 3 replies      
Social Engineering is not accepted in most hacking contests.
eridius 2 days ago 2 replies      
Calling a website that happens to host static content in the same repo as its PHP source a "release of a software project" really seems like a stretch.
Electron 1.0 is here github.com
677 points by alanfranzoni  4 days ago   393 comments top 45
grinich 4 days ago 9 replies      
Our team at Nylas has been incredibly lucky to build on the shoulders of the folks at GitHub and I'd just like to thank Kevin, zcbenz, Jessica, and their entire team. They've been awesome to work with and super supportive of this new community.

Our early prototypes of Nylas N1 were built on Angular in the browser, and then Adobe Brackets, but we couldn't really get it to a level that felt great and worked offline. It wasn't until we decided to fork Atom (the previous base of Electron) that we started breaking past the "uncanny valley" of wrapped webviews and discovered an application stack that allowed quick cross-platform deployment alongside native code modules.

After building on Electron/AtomShell for 18 months and seeing the community grow, I can definitely say there is something really special here. We've still got a lot of work to do on N1 (email is hard!) but we're confident Electron is the right platform for this type of extensible app.

A secondary reason we open sourced N1 was to serve as a reference implementation of a large Electron app. There still isn't a great resource for "best practices" when creating complex Electron apps with lots of moving parts. Now that we've hit 1.0, I think it's time to change that!

If you have a free afternoon, I definitely recommend playing around with Electron. It will likely change your outlook on the future of desktop software. :)

etatoby 4 days ago 14 replies      
I'm happy and grateful for any and all open source software, because it enriches everybody, well beyond the scope of its creators. But someone has to say it:

Electron is the cancer that is killing desktop computing.

It all started years ago with Firefox, whose interface itself was built using web technologies, in a "brilliant stroke": DOM, CSS, JavaScript... maybe not HTML per se, but an XML substitute, and so on. I dare anybody to say that Firefox's interface has ever felt as fast as IE, Chrome, Opera, or Safari (on Mac). It never did and still does not.

Then someone at GitHub had the bright idea to take this "winning" concept and apply it to a developer's text editor, of all things! I still cannot fathom how Atom can have more than 3 users. Every time I've tried it, I've ditched it after 30 seconds. Slooooooooooow!

Fast-forward to 2016: now I see new Electron apps popping up every other day. Even something as simple as a text-only WhatsApp IM client, which could be written in a dozen C++ files, is a bloated monstrosity that eats RAM for breakfast and contains an entire Node.js interpreter and a WebKit layout engine.

Cancer, I say!

Kill it with fire!

Philipp__ 4 days ago 3 replies      
While there are many awesome things about Electron, I still give native desktop apps the advantage. Native apps just feel better. It's all those ms (milliseconds) here and there that in the end make a huge difference. Still, the best showcase is Sublime vs. Atom. I use Atom, because it's free and open-source, but when I fire up Sublime from time to time, it is amazing how much better it feels. It feels right! Guess like everything else in life, it has its good and bad sides.
StevePerkins 4 days ago 8 replies      
I can definitely see a niche in which Electron serves well.

However, it seems weird that when talking about mobile apps, the PhoneGap/Cordova "web wrapper" concept is derided as awful compared to native code or cross-platform frameworks. From my anecdotal experience, those same people tend to think that Electron or NW.js are the greatest thing since sliced bread.

Electron is PhoneGap for the desktop. That's not necessarily a bad thing or a good thing, it's just a thing that makes sense in certain use cases and not others. The fact that web wrappers have different levels of "street cred" across mobile/desktop contexts feels more subjective than objective.

I suspect it's simply a matter of the younger crowd having more exposure to native mobile development, and little or no experience with native desktop development... so this discrepancy reflects those different comfort levels.

bengotow 4 days ago 3 replies      
I've been building on Electron for the last 18 months (@Nylas), and it's really impressive how far it's come in the last year.

Coming from native Mac OS X development, Electron is a breath of fresh air. There's an incredible amount of energy and momentum in the web development community these days, and things like Flexbox, ES2016, React, and ESLint make it possible to ship fast and iterate quickly. Who would have thought JavaScript would be achieving what Java/Swing set out to do in the 90's?

I've had a chance to work with the core team on a handful of bug fixes and new features, and they've been incredibly kind and welcoming. I think Electron will go far as an open source project, and I'm excited that GitHub is growing it beyond Atom.

If you're in the SF Bay Area and interested in learning more about Electron, there's a meet-up coming up later this month! http://www.meetup.com/Bay-Area-Electron-User-Group/

lpsz 4 days ago 6 replies      
Am I the only one unhappy with the trend of moving toward web-wrapper applications? As a developer, I also love the idea of cross-platform, and of more elegant frameworks, but it pains me to run things that are slower or hog the battery.

* JavaScript-heavy Spotify now vs. Spotify a few years ago

* Atom vs. something like Sublime Text

* Or, Slack that takes the cake especially if you log in to multiple Slacks.

It's cool for developers. It's not cool for the users.

ksec 4 days ago 2 replies      
I know this is not really electron's fault, but could there be way to shrink these run times to a lot smaller?

Take the WhatsApp desktop app: you literally have a 60MB download and a 200MB install just for a screen that is EXACTLY the same as the web version of WhatsApp.

While I don't expect all apps to be like CCleaner or the old uTorrent, which are only a few MB in size, 200MB is just ridiculously large.

gbugniot 4 days ago 9 replies      
"Electron has lowered the barrier to developing desktop applications". Nope, I don't think so. Perhaps for web developers.

For non-web developers, there are already well-suited technologies to develop desktop applications: Swing, Qt, Cocoa, GTK, WPF/WinRT, and so on. Maybe these technologies seem less sexy, but they have been here from the beginning to do this kind of job.

Please, use the right tool for the right job.

greenspot 4 days ago 0 replies      
My biggest gripe with Qt is HiDPI support, which was introduced just 6 months ago, so quite late. It feels cumbersome compared to HTML/CSS, which has had native HiDPI support built in for years. Also, the implementation in HTML/CSS is straightforward. You don't need to think about it for a second; it just works. Actually, this is my number one feature of web tech: it's HiDPI-ready, and websites and apps like Slack always look slick, crisp and clear on any platform.

With Qt, just watch this 30-minute presentation about Qt and HiDPI; while it's workable, there are some parts that unnecessarily complicate matters: https://www.youtube.com/watch?v=2XPpp4yD_M4

fsloth 4 days ago 5 replies      
What do you think the suitability of Electron is for software that is designed to provide revenue with per-seat licences and no service component, i.e. a 'traditional desktop app'? It seems to me, given how easy it seems to be to copy and modify an Electron application, that a third party could always copy an application, whitewash it, and capture a portion of the market.

The counterpoint to this is that ownership of the developer brand and distribution channel is what will always drive sales. I'm not sure which aspect is more important when planning a technology platform and revenue model...

porker 4 days ago 0 replies      
Congratulations on reaching 1.0. Electron is really interesting, but until memory usage can be curbed, it's going to be a limiting factor.

I have been going back and forth with Slack over their desktop client; on Windows when signed into 5 groups it uses 1GB RAM. For a rich-chat client.

And if you're thinking "But RAM is cheap these days" -- well yes it is, but by the time you've got 15 Chrome tabs open, a Jetbrains IDE and 3 VMs, plus assorted other software running, 16GB disappears very fast...

albeva 4 days ago 4 replies      
Slow, sluggish, resource-hungry, and looks alien everywhere. Yeah right. Progress... If there is one thing it's done, it's lower the standard for native applications everywhere.
jokoon 4 days ago 0 replies      
Burn this.

Instead, please, can any investor just hire a computer science PhD (ideally one who specialized in compiler engineering) and tell him to work on PREPARSED or COMPILED HTML, and make a browser module out of it, for the sake of speed and memory footprint?


e0m 4 days ago 0 replies      
Think about how much time people spend in front of 24" screens with a keyboard. "Desktop", or whatever that evolves into, is overwhelmingly where real work still gets done. For as critical as that environment is in the modern workplace it's historically been drastically underserved by this community.

Yes, there are a lot of tools like Swing, Qt, Cocoa, GTK, WPF/WinRT, but their development communities are much smaller than the javascript/web ecosystem. They also create enormous portability problems, particularly in an environment (especially in the business world) so heavily dominated by Windows machines.

This community is acutely aware of what happens when the barrier to entry is lowered and the development tooling improved. The tooling that Electron provides via Chromium is also something that should not be underestimated. Chromium's dev tools are remarkable and improving every day. Few other ecosystems, especially old desktop ones, have debugging environments as rich. The painting performance / profiling tools alone go to great lengths to keep everything running at 60fps. Furthermore, modern CSS layout engines like Flexbox give you an enormous head start on a notoriously difficult problem and are a joy to work with when you don't have to worry about browser compatibility.

I will admit, the getting-started cost of Electron is high. Shipping all of Chromium and Node is no small feat and frankly probably not suited for a minor utility. However once an app crosses even a modest level of sophistication the benefits of this environment are definitely worth it. There are also several specialized tasks that Node probably isn't suited for. Luckily, since you have full process control and the ability to compile native modules any additional specialized task can be run on the side.

The past 18 months working with Electron at Nylas have been some of the most enjoyable development experiences of my life. Much of the crap that frustrates people out of web development (compatibility issues) go away. Being able to build an app at the core of people's day-long experiences is deeply satisfying and something I'm indebted to the Electron team for.

If you're in the Bay Area and still have questions or are just curious, come join us at the Electron Meetup on Wed May 25: http://www.meetup.com/Bay-Area-Electron-User-Group/events/23...

staticelf 4 days ago 1 reply      
One issue I have with Electron is that if I put my computer to sleep without restarting it and keep all the applications up all the time, Electron apps get very slow and sluggish and lag very much. Slack is a good example of this.

I have to restart my computer every now and then in order to keep using Slack and Atom, because otherwise it lags so much. I don't know, I think I prefer UWP apps instead.

colordrops 4 days ago 2 replies      
My team is trying to use Electron to build and deploy apps for Linux, Mac, and Windows, but running into issues with the Mac build. Getting a dmg built on Linux (our build server) is apparently not trivial. Is there a docker image or some other project that wraps all the dependencies up to build Electron apps for all major platforms?
mherrmann 4 days ago 0 replies      
Electron is really cool, unfortunately its startup performance is slow (several seconds on a ~2010 machine). That's why I had to pick Qt for a file manager I'm launching... (https://fman.io)
twotavol 4 days ago 0 replies      
Every Electron application I've used has been sluggish and straight up pathetically slow compared to any native counterparts. I think there are a few reasons it's picking up steam:

* You can now make your web devs (HTML/CSS/JS etc.) do your application development as well

* General popularity of web development is exploding

* There's no comparable free, permissive native framework. The closest thing is Qt with its LGPL. No one wants to touch the GPL.

webXL 4 days ago 0 replies      
I just came across a great Electron app called Nativefier (https://github.com/jiahaog/nativefier, https://news.ycombinator.com/item?id=10930718) last night. It's a super easy way to make dedicated browsers & launchers for certain web apps. (Very anti-web, I know.)

My first use case for it was making a dedicated Amazon Video client for my wife's laptop (although Nativefier doesn't have one of the necessary plugins) so we can keep our regular Amazon accounts separate without switching credentials all the time. But I can think of a bunch more uses for this.

kristianp 4 days ago 2 replies      
It would be nice if Electron could enable cross-platform apps, written in languages other than JavaScript, that compile to native for the back end and to JavaScript for the front end.
john_reel 4 days ago 2 replies      
Electron has gotten a lot of criticism for being bloated. Are there any good alternatives for making desktop apps that are scripting-language based? I'd especially like to use Lua(JIT), but I'm not aware of anything anywhere near as reliable, cross-platform, easily deployed, and easy to build interfaces in as Electron. I'm not the biggest fan of JS and/or the JS ecosystem, but I would love the speed, lightness, and convenience of another language (just look at how fast and lean Lua with LuaJIT is!) combined with the power that JS tools like Electron have to offer.
crisnoble 4 days ago 2 replies      
Is there a "built with electron" type directory out there somewhere?
calsy 4 days ago 0 replies      
This is awesome! I wish I had something more constructive to say, but that's all I could think of at the moment.
stuaxo 4 days ago 2 replies      
Are there plans for electron with the servo browser ?
mixmastamyk 4 days ago 1 reply      
Interesting, reminds me a bit of XUL and XULRunner from the old days, which sort-of failed. What does this have in common with those, and what's different this time?
iamcreasy 4 days ago 4 replies      
Electron looks like it has the best of both worlds, but what are the drawbacks of this kind of system?
hs86 4 days ago 1 reply      
Would a dynamically linked, system-wide Electron installation help with the extensive resource usage of Electron apps? It would decrease the installation size but what about the runtime performance?
theknarf 4 days ago 0 replies      
I like the idea of making desktop apps with HTML, CSS and JavaScript. I just wish that Electron/Atom wasn't so horribly slow. It shouldn't take me more time to open up a desktop app than a webpage.
Longwelwind 4 days ago 3 replies      
I find it more comfortable to make a GUI using HTML/CSS, but I wish I could use a more modern and more reliable language than Javascript.

Is there an equivalent to this but with Python or C# ?

marcosscriven 4 days ago 1 reply      
One of the things I really like about Electron apps is they look good consistently across platforms, which is something I don't often see for cross-platform apps.

Qt seems to be the other major cross-platform framework, but I've never seen a Qt app that looks good IMO.

Would be very curious if any fellow HNers could point to good alternatives (preferably C++ based, but as I allude to, I'd compromise by using something like Javascript if it meant a better real world outcome for the user).

felixrieseberg 4 days ago 0 replies      
If you're based in San Francisco, come to our next meetup! We'll have a bunch of people working on and with Electron there.


marcosscriven 4 days ago 1 reply      
Related to another comment I posted here, what's the opinion lately on native controls/UI widgets vs something cross-platform like this?

It always used to be argued (and maybe still is) that using the UI framework of the platform is preferable, but IMO the results of Electron apps I've seen look great, and I don't find them confusing to use.

The reverse seems to be true on mobile platforms, where people seem to be preferring native UI widgets and behaviour.

mgkimsal 4 days ago 0 replies      
It's too bad the Titanium platform with its desktop/mobile/JS/web combo never got more momentum. Building a 'web' app - with PHP or Ruby - and bundling it as a desktop app was pretty darn cool. I only dipped my toes in around 2011 or so, and was sad to see it never get more traction. :/
tmarkovich 4 days ago 0 replies      
On a slightly unrelated note, I love the theme of the Electron plot. Is there any place where I could find it?
voltagex_ 4 days ago 0 replies      
So CatLight [1] seems to be a C# app using Electron for UI. I'm assuming there's a local web server running, but I wonder how this works from a build perspective? I wonder if Electron can be built with MSBuild.

1: https://catlight.io/

pj_mukh 4 days ago 1 reply      
Hmm? Slack's MacApp purports to use MacGap not Electron (https://github.com/MacGapProject/MacGap1).
robotnoises 4 days ago 2 replies      
Anyone with experience with Electron and NW.js have an opinion on which is better?
jweinstein 4 days ago 0 replies      
Congrats to the Electron community!

We've been excitedly using Electron at Wagon to build SQL analytics tools. We're using Haskell + React + Electron as our primary technologies: wagonhq.com/blog/electron

cdnsteve 4 days ago 2 replies      
I haven't used Electron, but what about mobile apps? I have a need for an app that works on desktop and mobile. So is the recommended option Electron for desktop and for mobile PhoneGap/Cordova?
iconjack 4 days ago 2 replies      
Microsoft tried something similar years ago with HTML Applications (.HTA), essentially web apps that could talk to the file system. Didn't seem to work out, though.
justncase80 3 days ago 0 replies      
Congrats! This is an awesome tool and I hope it ushers in a whole new generation of desktop apps!
gontard 3 days ago 0 replies      
"any application that can be written in JavaScript, will eventually be written in JavaScript" — this seems to be more and more true, and depressing. A JavaScript hegemony.
dang 4 days ago 1 reply      
This breaks the HN guidelines, which ask you not to call names ("you are acting like a child", etc.) in comments here. Please post civilly and substantively, or not at all.



We detached this subthread from https://news.ycombinator.com/item?id=11674670 and marked it off-topic.

uola 4 days ago 0 replies      
To me, unless things have changed, Electron is crazy. Many people today use some sort of abstraction layer that generates their HTML/CSS. I really hope they move towards interfacing JS or Node.js directly with Skia (which is the graphics engine for Chrome and used by, I think, Sublime Text).
rufb 4 days ago 0 replies      
This reminds me of Shoes. Of course, Javascript isn't designed for simplicity like Ruby, and Electron is aimed at serious product development rather than novice and weekend programmers. The product design just isn't "there" yet if we are to measure it by _why's standards. But it's nice to see an ACCESSIBLE actively-maintained tool for making cross-OS apps that "just work" again.

In fact, _why released Shoes around 2007 and disappeared in 2009. To put this in perspective, Google Chrome and the 1st iPhone were released in 2008. The web standard mess (JS in particular) was only beginning to be untangled back in 2007. Maybe today's _why would have preferred to try and make Javascript more approachable rather than choose a simple language like Ruby and make it more powerful. I don't know.

At any rate, the trend for things like Electron is to become increasingly complicated to the point where it merits a mention in developers' CVs as it happened with Rails and Node and countless other frameworks before. Hopefully at this point there are players with stakes high enough at making things accessible to push for an entry-level version of Electron. Maybe Codecademy or one of its cousins.

Save Firefox eff.org
700 points by DiabloD3  4 days ago   257 comments top 32
azakai 4 days ago 8 replies      
> [The W3C] needs to hear from you now. Please share this post, and spread the word. Help the W3C be the organization it is meant to be.

This isn't about the W3C.

This is about EME, and about the companies that created it and promoted it: Google, Microsoft and Netflix (as you can see on the spec, for example https://www.w3.org/TR/encrypted-media/ ).

Telling the W3C not to do DRM is not going to be effective. The only thing that can work is to put direct pressure on the parties behind EME, and their products: Google and Chrome, Microsoft and IE/Edge, and Netflix.

Not only is it not effective to focus on the W3C, it's counterproductive - it shifts the blame away from the real culprits just mentioned. If you lobby the W3C against EME but still use products from the companies that created EME, you're sending mixed messages at best.

Furthermore, even if somehow we got the W3C to not do EME, it wouldn't matter. Google, Microsoft and Netflix would still be implementing it. They would just find another standards body.

fpgaminer 4 days ago 1 reply      
Media DRM is not and never was primarily designed to prevent piracy. Rather, DRM is used by content producers (Fox, Disney, Warner, etc.) to assert control over the rest of the vertical market. This article is a prime example of this. Thanks to DRM, the movie studios force browser vendors to sign agreements to get access to the CDM, and from that agreement they can assert control. They can subtly suggest, for example, "Hey, Mozilla, could you revamp your plugin API to make blocking ads harder? It's fine if you don't, but, oh, by the way, your CDM agreement expires next month. Looking forward to seeing you at the re-negotiation meeting."

The same thing goes for encryption on Blu-ray discs, which forces Blu-ray player manufacturers to sign agreements with them. HDCP on HDMI and DisplayPort asserts control over TV manufacturers and infests video cards.

This is the same industry that pushed the DMCA on us, extends copyright in perpetuity, sues families because their kid downloaded an MP3, would like nothing more than SOPA to pass, etc, etc.

I know that the comments here like to demonize Google, Microsoft, Netflix, etc. Honestly, I don't believe it's their fault; Netflix in particular. Netflix is in no position to fight this. If they say no, the media empire will pull all their licenses and the company will collapse. And Netflix is already fighting for its life against these same companies for net neutrality (the major ISPs are owned by the media empire...). Google is leashed by its need for advertising revenue. Microsoft is beholden to its customers, who want access to DRM'd content.

In other words, we shouldn't be taking our fight to the W3C, Google, Microsoft, Netflix, etc. The media empire is the real enemy here. And there's hope. The rise of cheap, digital cameras and distribution platforms like YouTube and Twitch have enabled a wide array of independent artists to create AAA content mostly unbeholden to the incumbent media giants. Some of the best and most entertaining content I've watched has come from Patreon funded YouTubers. If that was the only content that the world watched, the media empire would starve and whither away, and DRM along with them.

jflatow 4 days ago 3 replies      
> This system, "Encrypted Media Extensions" (EME) uses standards-defined code to funnel video into a proprietary container called a "Content Decryption Module." For a new browser to support this new video streaming standard -- which major studios and cable operators are pushing for -- it would have to convince those entertainment companies or one of their partners to let them have a CDM, or this part of the "open" Web would not display in their new browser.

This is the crux of the issue. The W3C is creating a standard which gives control to the publishers over which browsers can display their content.

Whether that's "right" or "wrong" is worth debating, but sometimes the real issue at stake gets obscured in these discussions.

k-mcgrady 4 days ago 3 replies      
Honestly, I'm not against online/streaming content being protected with DRM. I don't think it's very effective but it doesn't effect me as I don't own the content so I don't really care.

This seems to be a step to far though. The browser should be a standards based 'viewer' that anyone with the will and the time can create. Let's say Netflix implements this DRM. They account for more than a third of internet traffic. If your browser can't support Netflix it's dead in the water.

This is open to so much abuse. The gatekeepers (it seems to be the entertainment companies in this case) get to choose which browsers live and die. As we've seen over the last 20 years competition in the browser space is very important - without Mozilla stepping up and competing with IE I can't imagine the sorry state the internet would be in today.

Edit: Once again, the DMCA rears its ugly head. Time and again it seems to be the thing that is abused to screw over consumers. Maybe that's what we should actually be fighting against.

hsod 4 days ago 2 replies      
> This system, "Encrypted Media Extensions" (EME) uses standards-defined code to funnel video into a proprietary container called a "Content Decryption Module." For a new browser to support this new video streaming standard -- which major studios and cable operators are pushing for -- it would have to convince those entertainment companies or one of their partners to let them have a CDM, or this part of the "open" Web would not display in their new browser.

Isn't this just a standardization of the status quo, with Flash/Silverlight? Why is it that I always feel like I'm being sold a bill of goods when I read EFF pieces?

Ileca 4 days ago 1 reply      
You can test your convictions by disabling DRM content in Firefox. Uncheck "Play DRM content".

Unfortunately, convictions won't have consequences on future decisions, because the standard is here, and the longer you wait the more embedded it becomes. The W3C allowed it to come to light at a time when the demise of the various plugins would otherwise have made DRM unviable, or at least more difficult to implement and to reach general agreement on. Now, even if you can opt out with Firefox, Netflix really doesn't care, because by deciding to disable it you are a bad client anyway. I understand why the article talks about pop-ups, because the moment Firefox decided to implement it, we lost the fight. I use Firefox, but lately I am saddened by their lack of strong convictions and how they tend to follow Google a little too much. (At least FF sandboxed the CDM, which, while not perfect, is something the other browsers didn't do, isn't it?)

maker1138 4 days ago 2 replies      
The biggest problem is intellectual property. Copyright lasts life + 70 years and patents last 20 years. That's a long time to have a legal monopoly on something, and is partly why companies are so big and can behave so badly.

Innovation comes through competition, not monopoly. Ideally, we'd eliminate patents and copyrights altogether, but as a compromise, I think having terms of 3 years, with no renewals, is fair. That way a business can capitalize on what it creates and get a 3 year head start on competition, but you still get competition fairly soon which benefits consumers.

usernamebias 4 days ago 2 replies      
Can someone explain why we're stoking the fire this late in the game? Not that it shouldn't be stoked.

Firefox implemented this since May 12, 2015 -- https://blog.mozilla.org/blog/2015/05/12/update-on-digital-r...

Chrome's had it since v 42


developer2 3 days ago 0 replies      
Are clickbait titles permitted on HN? The link has absolutely nothing to do with Firefox, let alone "saving it". It's an opinion piece / call to action regarding the W3C and the state of Encrypted Media Extensions. "Firefox" does not belong in the title, as it's irrelevant to the topic. Luring us with the name of a popular open source application, to then present a piece with a barely-related agenda behind it should not be acceptable.

As a side note, I'm sad to see that the EFF has adopted a PETA-like strategy to the way they tackle issues.

brianpgordon 3 days ago 0 replies      
> users want to sit in the driver's seat.

> We need more Firefoxes.

> We need more browsers that treat their users, rather than publishers, as their customers.

Until they started talking about DRM I was hoping that we were "saving Firefox" from mandatory extension signing.

As of Firefox 47, you will not be able to install any extension which hasn't been digitally signed by Mozilla. There will be no about:config setting to override this. They claim that this will prevent adware from disabling the digital signature requirement. But it's also taking power out of the hands of users, with the justification that supposedly Mozilla knows better than their users do what code they want to run.

This is the death-knell of Firefox for me. I'll be switching to an unbranded fork and hoping that the security updates keep coming.


xori 4 days ago 0 replies      
I'm interested to see how effective EME is at preventing illicit copying of media. YouTube and Netflix both use DRM now, but it doesn't stop youtube-dl or pirate WEB-DL rips from Netflix from existing.
AtticusRex 3 days ago 1 reply      
Question: He says EME will allow publishers to dictate which browsers can implement CDMs that can interoperate with their content, and therefore control the browser market, and that this will quell innovation. I have questions about this, however. In the old but waning status quo, Adobe and Microsoft got to decide which browsers would work with Silverlight and Flash (right?) so it still wasn't possible for a developer to make a new browser that could play DRMed video without getting their permission. What is the meaningful difference from the new status quo?

Is the difference that now, publishers control content and compatibility, whereas before publishers controlled content and DRM companies controlled compatibility? Is that actually a meaningful change for users or for browser developers? It doesn't seem like it is.

Am I missing something?

xvilka 4 days ago 0 replies      
There is a way to protect the content by adding per-user (subscriber) watermarks in the video/audio streams. Thus, no one will need these shady CDMs and Co. Of course, you say, you can try to find those watermarks/etc. But in the same way you can try to circumvent CDM code as well. Still, it will allow to eliminate proprietary extensions from the web standards.
majewsky 3 days ago 0 replies      
Honest question: How relevant is DRM in preventing piracy in non-interactive media?

Consider a theoretical world in which DRM would reliably prevent unauthorized copying or decryption of DRM-secured content 100% of the time. The obvious attack vector for pirates would be to play the video and audio and just capture it with a camera directly in front of the monitor, and a microphone attached to every speaker.

Are pirates doing this today, or is it just not worth it because DRM schemes are easily circumvented? I'm quite confident that copying of the physical signals should produce good results. There are consumer cameras capturing 4K video, and a video that's distorted by a non-orthogonal view on the screen can trivially be fixed in software. (It loses some fidelity, but you should still be able to get near-full-HD output out.)

wahsd 4 days ago 1 reply      
I just found out that Firefox removed the 3D Inspector with v.47. It's a shame because that was an excellent tool for auditing and inspecting. If you haven't had the chance, give it a whirl.
pmoriarty 4 days ago 0 replies      
Better solution: repeal the DMCA.
jakobdabo 4 days ago 2 replies      
Isn't it trivial to reverse engineer the DRM module to create its clean room open source implementation thus effectively deprecating it?
SmellyGeekBoy 3 days ago 1 reply      
> Literally none of the dominant browsers from a decade ago are in widespread use today.

Sorry to nitpick and detract from the real point here, but unless my memory deceives me IE was the dominant browser in 2006 and by a lot of measures still is. What a bizarre statement to make.

phn 3 days ago 0 replies      
Well, the issue is that popups were a nuisance, while being able to watch all those publishers' content is not.

I totally understand the concerns, but making users choose something out of ideology is much harder than simply providing a better experience.

baby 3 days ago 0 replies      
I'm actually more concerned by the "save itunes" ( https://news.ycombinator.com/item?id=11670232 )
Karunamon 4 days ago 6 replies      
Let's get something straight here. This EME debacle was never a choice between DRM and no DRM, it was a choice between DRM in a consistent standard vs DRM with a thousand ad-hoc plugins.

The browser without EME will be pilloried by its users for not supporting the content they want to access. Users use a browser to access content, not to support philosophical positions on what software should and shouldn't do.

The lesser of two evils was chosen. You don't have to like it, but that's the reality of this situation. It is not realistic to suggest that the largest browser vendors not support user demanded content.

Speaking of philosophical positions, most DRMed content accessed by a user in a browser is going to be of the streaming variety, i.e. the DRM isn't preventing you from doing anything you were otherwise supposed to be able to do anyway.

dredmorbius 3 days ago 0 replies      
The problem here is capture. W3C has been captured by the digital restrictions management cabal. Mozilla, Google, Apple, and Amazon are playing along. In three cases, they are the cabal.
reacweb 3 days ago 1 reply      
Currently, watching a DRM-protected video requires Flash and gives an inferior experience to watching a non-DRM-protected video. Most non-protected videos can be easily downloaded using youtube-dl. Recently, I wanted to watch a movie on M6 live (French TV). Firefox (on Wine, to have the latest upgrade of Flash) crashed twice. As a result, I downloaded it from a torrent and removed it after watching it.

I think the current situation gives a lot of motivation to avoid DRM. If EME becomes a standard, we would lose much.

0x0 4 days ago 1 reply      
If firefox really cared about its users maybe it should stop force-feeding "value-adds" like Hello and Pocket down everyone's throat by default.
literallynone 2 days ago 0 replies      
Just don't pay for DRM'd content. Use torrents, libgen, etc.
746F7475 3 days ago 1 reply      
I thought from the title that this was a plea for Mozilla to rewrite Firefox from scratch, since it's such a bloated mess
LoSboccacc 3 days ago 0 replies      
firefox invested more time in the omnibar nobody wanted than in fixing compatibility issues.

we are actively discouraging people using firefox because whenever we try to use anything modern, firefox will fail it.

the wonky outline implementation has been borked for more than a lustrum, has multiple bugs opened and ignored, etc. https://bugzilla.mozilla.org/show_bug.cgi?id=687311

and I found people complaining as early as this http://www.webdesignerdepot.com/2010/03/css-bugs-and-inconsi... and now I wonder how many of those are still there.

firefox cornered itself out of relevance, and this

"We need more Firefoxes.

We need more browsers that treat their users, rather than publishers, as their customers."

doesn't match with firefox priorities as observed so far at all. firefox needs to save firefox.

jbmorgado 3 days ago 0 replies      
What Firefox needs to be saved from, it's from Mozilla.
_nato_ 4 days ago 1 reply      
Can someone clarify what is meant by `publisher' in this piece?
neurobuddha 4 days ago 1 reply      
Doesn't Mozilla have a lucrative deal with Yahoo? I mean Yahoo!
shams93 4 days ago 0 replies      
Servo might save it. The C++ codebase for Firefox is a nightmare, but Servo could wind up taking back the crown from Chrome.
binaryanomaly 4 days ago 1 reply      
As much as I would love to... Just recently switched from FF to Chrome, since the latter just works a lot better technically :-(

Hope the new servo engine can make FF shine again otherwise I fear the worst.

A Farewell to FRP elm-lang.org
768 points by sharksandwich  5 days ago   240 comments top 46
ccapndave 5 days ago 7 replies      
I moved from React/Redux/Typescript to Elm a few months ago, and I'm now on my second serious project with it, making mobile application with Cordova. I've found it an absolute pleasure to use, and I especially like that you can be pragmatic about it; if there is something that is annoying to do in Elm then you can simply drop into Javascript/Typescript using ports.

Coming from languages like Actionscript/Javascript/Typescript/PHP I have found the whole Elm experience quite mindblowing; if it compiles, then it just seems to work. I hardly ever find myself poring over bits of code trying to debug a subtle error, and if I do the problem is going to be in the Javascript/Typescript bit.

Basically, I'm sold :)
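
To give a rough idea of the "drop into JavaScript using ports" workflow: the port names and the storage example below are made up, and a tiny stub stands in for the Elm runtime (in a real app, `app` would come from something like `Elm.Main.fullscreen()`), but the pub/sub shape is the one ports expose.

```javascript
// Minimal stand-in for the subscribe/send surface an Elm port exposes to JS.
// This stub only exists so the sketch is self-contained; the real object is
// created by the Elm runtime.
function makePortStub() {
  const subscribers = [];
  return {
    subscribe: (fn) => subscribers.push(fn),
    send: (value) => subscribers.forEach((fn) => fn(value)),
  };
}

const app = {
  ports: {
    askStorage: makePortStub(),    // Elm -> JS: request a value
    storageResult: makePortStub(), // JS -> Elm: deliver the answer
  },
};

// JS side: answer Elm's requests with something Elm can't do itself,
// e.g. reading localStorage (faked here with a plain object).
const fakeStorage = { user: "alice" };
app.ports.askStorage.subscribe((key) => {
  app.ports.storageResult.send(fakeStorage[key] || null);
});

// Simulate the Elm side firing the outgoing port and receiving the answer.
const received = [];
app.ports.storageResult.subscribe((value) => received.push(value));
app.ports.askStorage.send("user");
console.log(received); // → ["alice"]
```

The nice property is that the JS on either side of a port stays an ordinary event handler, so anything annoying to express in Elm can live out here.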

junke 5 days ago 2 replies      
I am happy to see Elm evolving and it looks like a good framework, but there is a tendency in FRP articles to ignore prior work, as acknowledged at the end of the article:

> Note: Interested readers may find Lucid Synchrone interesting. Unfortunately for me, I had no idea my thesis had so much in common with synchronous programming languages at the time, but the connections are quite striking. I might argue that Elm was never about FRP.

My thesis was related to synchronous programming languages and articles about FRP tend to have too little to say about them, for my taste. Yes, there is a word or two in the related work, but also it looks like some wheels are being reinvented.

The subscriptions model reminds me of Esterel, which is imperative and uses await/emit pairs. In the domain of GUIs, which is related to Elm, there is ReactiveML (see "ReactiveML, Ten Years Later" (https://www.di.ens.fr/~pouzet/bib/ppdp15.pdf)). Look also at Lustre or Signal, with Signal allowing directives to define multiple execution units: this is used to generate concurrent programs exchanging values with message passing.

The domain is different, though. Synchronous languages do not target web interfaces. They are about embedded systems and as such, they are mostly static. On the other hand, they are compilable into a simple event loop with static memory usage and constant execution time. Maybe some of the existing research could be useful to something like Elm, even if it does not target the same problems.

zalmoxes 5 days ago 3 replies      
As someone who is just getting started with frontend development, I decided to go with Elm instead of learning JS and React. I found the whole experience very pleasant, even as a beginner.

Not only is the elm code I write reliable, but I've found that adding more features does not bloat my code. Refactoring a codebase as it grows in elm is pleasant, and following the Elm Architecture guides me on the correct structure for the app.

Over the weekend I made a small site to show all Elm conference videos in one place. If you want to play around with 0.17 this project is just a bit above a "Hello World" example. Send a PR! https://elmvids.groob.io/

e0m 5 days ago 2 replies      
There's also a lot of similarity with Rx.JS and observable based (aka "Reactor") patterns:

  Rx.Observable.fromTime().subscribe(tick)
  Rx.Observable.fromSocket(mySocket).subscribe(handleEvent)

Regardless of the exact library or pattern, the broader concept of treating data sources as asynchronous event streams you can subscribe and react to definitely simplifies data flow and makes systems very robust.

An additional benefit of this pattern is the natural way it makes it easy to filter, chain, map, etc onto these subscriptions. Once again from the Rx world, http://rxmarbles.com/ does a great job visualizing these patterns.
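
The pattern can be sketched without any library — this toy stream is not the real RxJS API, just the shape of it: sources push values, and subscribers can filter/map them before reacting.

```javascript
// A toy event stream: emit pushes a value to every subscriber, and
// map/filter build derived streams that transform values on the way through.
function stream() {
  const handlers = [];
  return {
    subscribe: (fn) => handlers.push(fn),
    emit: (v) => handlers.forEach((fn) => fn(v)),
    map(f) { const out = stream(); this.subscribe((v) => out.emit(f(v))); return out; },
    filter(p) { const out = stream(); this.subscribe((v) => p(v) && out.emit(v)); return out; },
  };
}

const clicks = stream();
const results = [];
clicks
  .filter((e) => e.button === "left")  // ignore right-clicks
  .map((e) => ({ x: e.x, y: e.y }))    // keep only the coordinates
  .subscribe((pos) => results.push(pos));

clicks.emit({ button: "left", x: 1, y: 2 });
clicks.emit({ button: "right", x: 9, y: 9 });
console.log(results); // → [{ x: 1, y: 2 }]
```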

timroy 5 days ago 3 replies      
This looks very cool.

In ClojureScript, we have the re-frame pattern/framework, which is built on Reagent, which is a ClojureScript wrapper of React.

re-frame is all about subscriptions, using a "big atom" to hold application state client-side. Seeing Elm implement the same subscription pattern makes it look pretty tempting.

My understanding is that ClojureScript and Elm have some similarities - functional, pleasant to work with - with one significant difference being that Elm is typed.

salimmadjd 5 days ago 0 replies      
Congrats to Evan. I'm still learning Elm and it has been such a pleasure. It's so much clearer to understand the code and go back to it later. I spoke with Evan at length at one of the regular Elm "hackathons" in SF and was so impressed by how he thinks about the long-term vision of Elm and prefers to take his time to ensure he gets it right, so that Elm will be around for a long while. Which is the reason I'm investing time into Elm. If you're in the SF Bay Area, tomorrow (Wed) is the next hack night, a great place to talk to Evan and the small community around it.
Keats 5 days ago 3 replies      
I like Elm but its development really seems to rely on a single person. I saw this post on elm-discuss which is a pretty good summary of my thoughts https://groups.google.com/forum/#!topic/elm-discuss/AmxE3qAm...
malandrew 5 days ago 2 replies      
Since people in this thread are likely to be elm enthusiasts and know what's going on in the ecosystem, what is the largest high quality elm app you know about and what is the largest high quality elm app that is open-source that you know about?

I've been doing backend work for a while and I'd like to see what is possible these days with elm.

knewter 5 days ago 0 replies      
I'm super excited for this change. I've been watching and waiting for a few weeks. It's a big ordeal for me personally, because I've got 10 weeks of daily elm tutorial content I've written so far at http://dailydrip.com/topics/elm that I'm now re-writing, but the changes are all for the better and the language adoption is bound to go up over time.

If you haven't yet tried Elm, give it a shot. A co-worker (hi Heath!) showed it to me 4(?) years before I got interested in it, and I brushed it off as a toy because I was against Functional Programming at the time for terrible reasons. It's actually one of the 'this feels like the future' technologies that's re-shaping how I think about programming these days. Huge kudos for the release!

zachrose 5 days ago 5 replies      
I haven't dabbled in Elm much, but subscriptions look a lot like ordinary JS event handlers:

 Time.every second Tick

 Time.on('everySecond', tick)

Beyond baking an event emitter into the Time module and having a nice-looking API, is there something I'm missing?
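
For concreteness, the comparison I have in mind (names hypothetical): an Elm subscription is data returned by a pure function of the model, and the runtime attaches/detaches the underlying listeners by diffing that description — you never call the equivalent of removeEventListener yourself. A rough JS analogy:

```javascript
// Declaratively describe which event sources we want right now,
// as a pure function of application state.
function subscriptions(model) {
  return model.running ? ["tick"] : [];
}

// A toy runtime that diffs the declared subscriptions against active ones.
const active = new Set();
function applySubscriptions(model) {
  const wanted = new Set(subscriptions(model));
  for (const s of wanted) if (!active.has(s)) active.add(s);           // attach new
  for (const s of [...active]) if (!wanted.has(s)) active.delete(s);   // detach stale
}

applySubscriptions({ running: true });
console.log([...active]); // → ["tick"]
applySubscriptions({ running: false }); // no hand-written teardown needed
console.log([...active]); // → []
```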

nilkn 5 days ago 7 replies      
What I really want is a language with Elm's type system, simplicity, and syntax, but aimed for backend software instead of HTML apps and with strong support for an Erlang-style actor model of multicore, distributed concurrency. Basically something like Elixir, but with Elm's syntax and type safety. In the meantime Elixir will do for me, but I'd really like more type safety without going full Haskell.
krisajenkins 5 days ago 0 replies      
This is marvellous. IMO Functional has always been a bigger deal than Reactive. Great to see more of that pursuit of simplicity paying off.
rdtsc 5 days ago 0 replies      
There is a nice talk from Erlang Factory 2016 called

"Making the Web Functional with Phoenix and Elm"


pklausler 5 days ago 1 reply      
Stupid question from an interested outsider: how's a "subscription" different from a good ol' callback?
zellyn 5 days ago 3 replies      
Just out of curiosity. I'm trying to decide which client-side framework/language to dive into, and it seems that many people consider Om Next a really nice step forward, away from "giant blob of state". My (uninformed) observation is that Elm does things the way Om (previous) did it: is that correct? Is Elm aiming to incorporate Om Next's advantages?
TY 5 days ago 0 replies      
Say what you will about languages that compile to JS, but Elm code looks so elegant that I just want to use it for aesthetics alone. Major kudos to Evan for such a wonderful and powerful creation.
skybrian 5 days ago 0 replies      
From the Lucid Synchrone paper referenced in the article:

"Synchronous languages are based on the synchronous hypothesis. This hypothesis comes from the idea of separating the functional description of a system from the constraints of the architecture on which it will be executed. The functionality of the system can be described by making an hypothesis of instantaneous computations and communications, as long as it can be verified afterward that the hardware is fast enough for the constraints imposed by the environment."

That sounds a lot like how functional programming works in a browser. If you assume function calls take zero time (or can be optimized so they're fast enough) then you end up with event handling and asynchronous I/O. Preemptive multitasking becomes a non-issue. But long-running computations (where you actually want to keep some CPUs busy for a while) need to be handled outside the system.

dreamdu5t 5 days ago 1 reply      
Looks like Elm moved closer to Pux's API by dropping the signals for `Html action`. Check out Pux if you're interested in the same architecture but for PureScript.


stepvhen 5 days ago 1 reply      
I got into studying FRP around 3 years ago for my senior project. Aside from a decent, clear explanation, there were hardly any actual implementations of this allegedly good paradigm (something Evan covered in his thesis, and the basis for Elm). It seemed like a paradigm that was full of promise and potential but failed to deliver in any worthwhile way.

(Such was the case for my senior project, studying the viability of Arrowized FRP in Elm. In short, I concluded that it was nothing but hell and nobody should bother.)

I am happy to see Elm drop FRP, even if I wished it could be the savior of the method. At this point I think it's a troubled concept and should be limited to old theses.

iamwil 4 days ago 1 reply      
I found that while things worked once they compiled, I spent most of my time staring at compilation errors, especially as I refactored. So I'm mostly trading time spent debugging subtle state errors for time spent fixing compile-time errors. It feels less productive, since I'm usually looking at errors.

Also, I'm still new to FP, so if you go outside of the Elm architecture, you're going to run into problems which you're used to solving with specific tools, but those tools won't be available. So you'll have to spend time learning how to compose things in a new way.

wyager 5 days ago 1 reply      
Elm is very cool and pleasant to use compared to JavaScript, but I have some reservations. The limited typeclass system (for numbers and such) seems... questionable. The interaction of what looks like pure declarative syntax and what is actually impure imperative semantics is confusing to me. I understand that the author wants to avoid monads and other perhaps somewhat confusing staples of functional IO, but I'm not sure the cost/benefit of doing it this way works out.

I will say that Elm is the best JavaScript-targeting platform I have tried! I have hopes for GHCJS, but it's not near Elm's level of readiness yet.

vvanders 5 days ago 0 replies      
Woah, still reading but seems like a big change considering FRP was somewhat fundamental to Elm.

[edit] Looks like a solid change driving Elm towards easier usage, Signals and Mailboxes were definitely something that took a while to wrangle correctly.

sotojuan 5 days ago 0 replies      
Elm has always been "that language I dabble in every now and then but never have time for"... good to see it's evolving. Evan and Richard (creator and main evangelist, respectively) are a great team and I wish them the best (and I hope I get more time to mess around with Elm!).
daxfohl 5 days ago 1 reply      
Wow, this is a surprise. Kind of like a Haskell headline "A Farewell to Immutability". But sounds like the pros and cons have been well thought out for the intended use case.
d0m 5 days ago 2 replies      
One thing I'm wondering with elm is how does it deal with state across different components? I.e. where would you put the settings of the current user and how would you access that within one component? (By component I mean the (init,update,view,subscription))
wrong_variable 5 days ago 0 replies      
subscriptions are great for handling black-hole callbacks - callbacks from which you cannot return control back to the caller.

Examples of this would be web sockets, HTTP requests, etc.

However, Elm and Cycle.js try to shoehorn this idea everywhere - which I think is not needed.

Not every event needs to be subscribed to - it pollutes your global stream. Use a global stream only when you need to deal with things that move outside the execution context of your program.
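The distinction drawn above can be sketched minimally. This is an illustrative subscription-style wrapper, not Elm's or Cycle.js's actual API; the names `Subscription`, `subscribe`, and `emit` are my own.

```javascript
// Minimal sketch of a subscription wrapper (hypothetical names, not a real API).
// A plain callback hands control to whoever fires the event; wrapping arrivals
// in a stream the program owns is the distinction the comment draws.
class Subscription {
  constructor() {
    this.listeners = [];
  }
  subscribe(fn) {
    this.listeners.push(fn); // register interest; control stays with the program
  }
  emit(value) {
    this.listeners.forEach((fn) => fn(value)); // deliver an external event to every listener
  }
}

const messages = new Subscription();
const seen = [];
messages.subscribe((m) => seen.push(m));
messages.emit("hello"); // stands in for an external event, e.g. a web-socket frame
messages.emit("world");
console.log(seen); // ["hello", "world"]
```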

gjolund 5 days ago 1 reply      
Doing away with signals is a great move; everything else about the Elm architecture felt so intuitive.

I'm excited to give Elm another shot, and I yearn for the future he describes where WebAssembly makes Elm a viable JS replacement.

mswift42 5 days ago 5 replies      
Does Elm have any nice interfaces for common JS frameworks? (like reagent / om in clojurescript, or angular2 for dart / typescript)
john-kelly 5 days ago 0 replies      
Hey! CodeHS is working on an Elm Course. The vision of the course is to make a more approachable version of SICP.

The course is still in development, but let us know if you're interested!


jweir 5 days ago 1 reply      
Has anyone upgraded a non-trivial app from 0.16 to 0.17? Is it pretty simple to rewrite moving from Signals to Subscriptions? Any advice?
jtwebman 5 days ago 0 replies      
I loved Elm before, but now it is even better! I'll make the switch even though I was almost done with my first app with it.
empath75 5 days ago 2 replies      
Is parsing json still an ugly mess? That was what put me off actually using it for anything.
charlieflowers 5 days ago 1 reply      
So, is it still true to say, "You wish you could express your business logic with pure functions. But often you can't because the lack of immutability hurts performance in various ways. But Elm creates a 'sandbox' in which you can do so, by letting you write functions which work over time-varying streams instead of stateful callbacks."

If not, which parts changed / how would you revise it?

creshal 5 days ago 1 reply      
> A web handler threw an exception. Details:

> gen/pages/blog/farewell-to-frp.html: getFileStatus: does not exist (No such file or directory)

Some caching problem, it seems?

acqq 4 days ago 0 replies      
For those like me who aren't familiar with Elm:

An Introduction to Elm


mhd 5 days ago 2 replies      
Has the Elm installation on Linux improved in recent months? Last time I tried, my system's GHC was too new to install it from source, and the Node installation went kablooey in weird parts (too-recent ncurses, the automatic "reactor" browser REPL not finding the Html package, etc...)
rtpg 5 days ago 0 replies      
The chat client thing is pretty interesting, but there's this little bit:

> Use the browser API: To get started you just create a new web socket! Well, then you need to open the connection. But do not forget to add an onerror listener to detect when the connection goes down and try to reconnect with an exponential backoff strategy. And ......

I wonder how error handling happens. If you take FB Messenger, for example: you can queue up a message, but if sending fails you get an opportunity to retry or not send it at all.

I suppose in FB's case you could write your own subscription provider...
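The backoff dance the quoted passage lists can be made concrete. A hedged sketch: `backoffDelay` is my own helper name, not any browser or Elm API; only `WebSocket`/`onopen`/`onclose` in the wiring comments are real browser API surface.

```javascript
// Pure retry schedule: double the wait on each failed attempt, capped at `max`.
// backoffDelay is a hypothetical helper, shown with illustrative defaults.
function backoffDelay(attempt, base = 500, max = 30000) {
  return Math.min(base * 2 ** attempt, max);
}

// Rough wiring against a browser WebSocket (sketch only, not run here):
//   let attempt = 0;
//   function connect() {
//     const ws = new WebSocket(url);
//     ws.onopen = () => { attempt = 0; };                            // reset on success
//     ws.onclose = () => setTimeout(connect, backoffDelay(attempt++)); // retry with backoff
//   }

console.log([0, 1, 2, 3, 10].map((n) => backoffDelay(n))); // [500, 1000, 2000, 4000, 30000]
```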

tunesmith 5 days ago 1 reply      
Say a magazine company wants to regularly send issues to a customer, and the customer signs up. Who has the subscription, the customer or the magazine company? Which one subscribes?

According to Rx, it's the magazine company: the magazine company subscribes. It took me a while to realize this when trying to learn Rx concepts, which made it really confusing, since I've always seen the customer as the subscriber, subscribing and owning the subscription.

It looks like Elm is the other way around compared to Rx, I think.

ryenus 5 days ago 0 replies      
FRP = Functional Reactive Programming

> When I started working on my thesis in 2011, I stumbled upon this academic subfield called Functional Reactive Programming (FRP). By stripping that approach down to its simplest form, I ended up with something way easier to learn than similar functional languages. Signals meant piles of difficult concepts just were not necessary in Elm.

avindroth 5 days ago 0 replies      
I have been learning some haskell, maybe I will dabble in elm. Watched a talk by Evan and was blown away by time-based sophistication.
hbrid 5 days ago 0 replies      
Looks very good. I got stuck learning elm at about the point I wanted to integrate a call to a javascript function. Can't for the life of me remember if that involved signals or not. Will definitely try the tutorial(s) again when I get some more time because I really was enjoying everything else about the language and developing in it.
kgr 5 days ago 0 replies      
FOAM is also subscription-based, but then builds FRP on top of it: https://www.youtube.com/watch?v=-fbq-_H6Lf4
millstone 5 days ago 0 replies      
Can someone explain how the subscription API differs from signals, addresses, and ports? I'm not familiar with the "old" Elm so I can't judge what the change is like.
leejoramo 5 days ago 0 replies      
I wish Elm's blog had an RSS feed.
Learn2win 5 days ago 1 reply      
Are there type classes in Elm?
miguelrochefort 5 days ago 0 replies      
Congrats, you just invented Reactive Extensions...
Artem vs. Predator ribbonfarm.com
644 points by sharpn  2 days ago   66 comments top 25
micheljansen 2 days ago 1 reply      
A cool property of NIR filters is that they are able to produce a relatively recognisable picture of the world with all screens (TFT, projected) filtered out. I've used this in the past for computer vision to do object recognition on top of a projected display without having to programmatically filter out the projection.

It's also pretty easy to make a rudimentary NIR filter by layering a red, a green and a blue filter on top of each other: the resulting filter will not allow any visible light through (i.e. red, green or blue), only the "rest".

edit: and they make your eyes look really creepy too (not me in the picture btw): https://flic.kr/p/6CYzDZ

dharma1 2 days ago 0 replies      
Not just heat, there is a lot of interesting spectral data we don't see. Hyperspectral cameras are rare and expensive right now - but I'm really looking forward to the day they are cheap and commonplace




monk_e_boy 2 days ago 2 replies      
I assume the distortions at the edge of the 'photos' are due to the X-Y scanning rig. Moving the sensor around the edge of a sphere (on a curved X-Y (and a little bit of Z) rig, with the focal point being the center of the sphere) would sort that right out.

I expect the 3D printer could make a curved rail for the rig.

Or you could point the sensor at the center of the pin hole or lens using a mechanical linkage, then use some software to warp the resulting image.

gregorkas 2 days ago 1 reply      
You know those black and white photos? I thought Russia looked like that with regular cameras :P.

Just kidding, this has to be one of the coolest projects I've ever seen. To make all this by yourself is just ... awesome.

SamBam 2 days ago 1 reply      
Forgive my ignorance, but if this system would require liquid nitrogen cooling to detect heat from a human, how do commercial (now almost cheap) thermal cameras work? Are they not sensing IR? They can certainly distinguish the different temperatures of different people, without needing to be cooled themselves.

How is this system better than a commercial IR camera -- besides the very obvious and valid reason that this is DIY, and started long before those were cheap?

> "The most amazing part is not that it glows, but that it glows brightly enough to illuminate the stand. Its not just the temperature mapped to an image of a regular heat vision camera, we see the actual long-wave light being emitted and reflected a soldering iron turned into a lightbulb!"

How is that different from this image, which shows a duck's IR being reflected by water, made by a cheap-o "regular heat vision camera"? http://thermal-imaging-blog.com/wp-content/uploads/2012/05/d...

sbierwagen 2 days ago 0 replies      

 This was my first encounter with ITAR - the "how dare you want interesting stuff?" restrictions. Before then I never realized just how the USA, uh, loves the whole world.
ITAR is a US regulation, and the USA doesn't have jurisdiction on a sale from a Japanese vendor to a Russian customer. You're thinking of MTCR, "an informal and voluntary partnership" between 34 countries. Both Russia and Japan are members: https://en.wikipedia.org/wiki/Missile_Technology_Control_Reg...

ommunist 2 days ago 0 replies      
Great read. Tireless enhancements until goal achieved. Great job.
dmvaldman 2 days ago 1 reply      
"..when I started there were no 3D printers. Eventually I made one, and all the clumsiness of the old rig got replaced with modern, 3D printed, well-fitted parts."

When I read this I thought, "nah"... but some digging later and


fitzwatermellow 2 days ago 1 reply      
Nice way to start the day ;) Can't someone send this guy a box of gently used CCDs and some wideband lenses already... or maybe an old copy of LabVIEW...

Oh and Artem, if you happen to see this, can you try getting out of the city on a cloudless night and taking a long exposure of the night sky? Thanks for your hard work and inspiring us all!

taneq 2 days ago 2 replies      
Great read! I'm curious about the last bit, saying you need liquid nitrogen cooling to detect the radiated heat from a human. I'd always thought some snakes have infrared-sensitive spots near their mouths that can sense mammals from a distance; are they just that much more sensitive than a photodiode? Or are they sensing something else?
alphapapa 2 days ago 2 replies      
So is the future wavelength project something like radio waves? It would be fascinating to see an image lit by radio waves. Imagine a wifi router being lit up like a lightbulb, casting shadows around a room, or a room lit by FM radio passing through walls and windows, etc.
Terr_ 22 hours ago 0 replies      
> Heat vision. [...] MWIR is neither heat, nor is it light. It's both

No, it isn't. The "infrared light is heat" statement is casually-useful but technically-false, which means it backfires and harms people when they start learning more about physics.

IR radiation is not the intrinsic heat-content of the matter emitting it. Infrared is simply a particular set of wavelengths which happens to be prominent for temperatures which are "hot" for us just because we evolved on this small rocky planet which has a certain baseline temperature.

Go ask some icy crystalline aliens or ones powered by low-grade nuclear reactions what they think about "heat radiation" and you won't get the same answer.
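The "prominent for temperatures that are hot for us" point is just Wien's displacement law. A small sketch: the helper name is mine, but the constant is the standard Wien displacement constant.

```javascript
// Wien's displacement law: a blackbody's emission peaks at wavelength b / T.
// A ~300 K human peaks deep in the long-wave IR; a ~5800 K star peaks in
// visible light. peakWavelengthMicrons is an illustrative helper name.
function peakWavelengthMicrons(kelvin) {
  const b = 2.898e-3; // Wien's displacement constant, in metre-kelvins
  return (b / kelvin) * 1e6; // metres -> microns
}

console.log(peakWavelengthMicrons(300).toFixed(1));  // "9.7"  (long-wave IR)
console.log(peakWavelengthMicrons(5800).toFixed(2)); // "0.50" (green-ish visible)
```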

sangnoir 2 days ago 2 replies      
> But for our goal its important to remember, because infrared light does not pass through glass

I thought that was UV?! If that statement is true, how on earth am I able to bask in the heat of the sun behind a glass window, indoors?

paulgerhardt 2 days ago 1 reply      
Infrared film used to be available commercially with sensitivity up to 900nm. It was quite useful for aerial photography work: forestry, surveying, spying. Very little of it is still around. Kodak Ektachrome was what one could get for your high school darkroom, Aerochrome for surveying work. It was discontinued in 2009. Ilford, however, still makes some. Be forewarned: finding someone to develop the stuff is a nightmare, the chemicals are toxic, and the shelf life is brief.

Scientific infrared films, such as special formulations of AeroChrome I, II, III, approached sensitivity up to 1200nm.

In surveillance work, objects which were painted to look like their natural environment using various organic or inorganic paints may show up quite differently in the infrared spectrum.

In forestry work, old growth tree populations could easily be distinguished from new growth tree populations and were one of the primary uses for Nasa's version of the U2 (ER-2) for identifying old-growth redwood populations in northern California. [1]

A lot of work was done in the 1970's and '80's by astronomers and physicists to 'hack' Eastman Kodak scientific film, or plates as they were called. (Once you move past "point and shoot" film, you get into the realm of plates, 4"x5" trays similar to old-timey 1880's cameras.) Things like Kodak I-Z. One technique was to hypersensitize the film by bathing it in ammonium hydroxide [2]. Lawrence Livermore had such an appetite for IR-sensitive film with their laser work that they set up their own production process for hypersensitizing Kodak scientific plates. Another was to supersensitize them with acetic solutions, getting film sensitivity in the >1500nm range [3]. This seems to be the limit of our knowledge for traditional chemical film processes.

Modern DSLR's have sensitivity up to 1600nm. Nikon worked with NASA for some of their special DSLR's [4].

One of the cooler things I saw was a University of Florida paper in Nature that used IR-OLED's to upconvert IR to visible light through a lens adapter achieving sensitivity from 400nm to 2000nm [5].

Beyond 2000nm you get into the MWIR range and FLIR devices take over.

[1] https://books.google.com/books?id=HZUTCgAAQBAJ&pg=PT129&lpg=...

[2] http://www.osti.gov/scitech/servlets/purl/4442636/

[3] https://books.google.com/books?id=nlftCAAAQBAJ&pg=PA259&lpg=...

[4] http://eol.jsc.nasa.gov/Collections/NearIR/IR_Intro.htm

[5] http://www.nature.com/articles/srep05946

stared 2 days ago 2 replies      
It is possible to see in near infrared - just use an IR filter (the kind used for photography) and stick it to your eye. Everything should look dim and reddish, but what is characteristic (and different from a red filter) is that leaves are very bright. Also, many black t-shirts are not dark.

Also, it's fun to make such photos:


IamFermat 1 day ago 0 replies      
For 21st century Predator vision, this is the closest to it. Thermal IR from FLIR - https://www.youtube.com/watch?v=L7URETAl75A. Someone should hack this into a pair of glasses.
spatular 2 days ago 1 reply      
Have you tried zone plates instead of lenses? Feature sizes should be larger for IR than for visible light, so it should be possible to fit enough zones to get some focusing.
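For scale, the zone radii can be computed from the standard thin-plate approximation r_n ≈ sqrt(n·λ·f); the function name and the 100 mm focal length here are my own choices for illustration.

```javascript
// Fresnel zone-plate radii under the thin-plate approximation r_n = sqrt(n * lambda * f).
// Zones scale with sqrt(lambda), so zones for 10 um thermal IR come out ~4.5x larger
// than for 500 nm visible light at the same focal length - the commenter's point.
function zoneRadiusMm(n, wavelengthM, focalLengthM) {
  return Math.sqrt(n * wavelengthM * focalLengthM) * 1000; // metres -> mm
}

const f = 0.1; // assumed 100 mm focal length
console.log(zoneRadiusMm(1, 10e-6, f).toFixed(3));  // "1.000" mm first zone, 10 um IR
console.log(zoneRadiusMm(1, 500e-9, f).toFixed(3)); // "0.224" mm first zone, 500 nm visible
```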
ivanb 2 days ago 0 replies      
This guy is spectacular. Check out his other work: http://orbides.org/ Here is a nice concept of an interstellar ship http://orbides.org/concepts.php?lng=eng
aavotins 2 days ago 0 replies      
Incredible dedication and a fantastic project. I wonder if a setup made from a FLiR Dev Kit, a Raspberry Pi and the Pi camera would produce a similar result?
haddr 2 days ago 0 replies      
Cool stuff!

I think that for near-infrared light a regular CCD element could be useful as well, as those are sensitive to infrared too. The only problem is filtering out the visible light to leave only the infrared spectrum. Also be sure that the CCD doesn't have an IR filter (some do).

t_fatus 2 days ago 2 replies      
Never give up... This must have taken SOOOOO long, with so many disappointments! Really nice job.
ck2 2 days ago 0 replies      
People who dig to understand and create stuff like this are just amazing.
vvanders 2 days ago 0 replies      
Now that's real engineering.
muloka 2 days ago 0 replies      
This was a fascinating read. Thanks for sharing.
mmastrac 2 days ago 2 replies      
This is a great story, but the original title of "Artem vs. Predator" is much better than the submitted one.
Hidden Microphones Part of Government Surveillance Program in the Bay Area cbslocal.com
556 points by randomname2  1 day ago   255 comments top 43
maxxxxx 1 day ago 10 replies      
If that's true, we are moving into a scary world, pretty much like 1984, where everything you do is monitored all the time. Add voice and face recognition, surveillance drones, cheap data storage, and AI to the mix, and it seems that pretty soon everything you do in a public place will be recorded and analyzed. And all this can be done fully automated.

I hope people will recognize that some laws need to change. Surveillance makes sense in some cases but it should be hard and expensive to do.

s_q_b 17 hours ago 2 replies      
Whoa, whoa, whoa. This is a very misleading headline.

The article contains no evidence of a government-operated network of listening microphones in the Bay area. It deals with one incident using a specific piece of surveillance gear.

What we actually know is this:

An FBI agent was attempting to bust two individuals suspected of a bid-fixing scheme.

This FBI agent then placed two recording devices in a light fixture and another at the bus stop nearest the courthouse.

After the surveillance recorded enough incriminating evidence, the FBI/DOJ sought to introduce the recordings at trial.

All else was rank speculation by a former FBI employee named Jeff Harp.

He also stated, "If youre going to conduct criminal activity, do it in the privacy of your own home... that was the original intention of the Fourth Amendment," which makes me question his grasp of the constitution, and thus his reliability as an expert.

Mendenhall 22 hours ago 4 replies      
Hmm, I wonder how this would work, considering it's all things openly available to be recorded.

Follow a federal agent home from work and film them and their house. Film their children going to school and post pictures, locations, and times online, also taking photos of other relatives, friends, etc. and where they live and their phone numbers, and post it all online.

Have your friends join in and keep posting every detail and picture you can get of federal agents and their families etc and compile it into a website.

Then let me know how that works out.

Edit: Does anyone know of any USA law that stands against such activity?

bobhaigler 1 day ago 1 reply      
It's not the best reporting, but from what I can tell, the FBI is collecting evidence against auction bidders who allegedly conspire to pay less for foreclosed real-estate. The auctions take place on the courthouse steps, and the bidders have been accused of working together to keep prices low by intimidating some bidders and conspiring with others. I'd love to see the auctions moved to a website marketplace. Maybe a good startup idea?
alva 1 day ago 2 replies      
The frequency response of most microphones (before DSP filtering) does not perfectly cut off above 20 kHz. I suppose if you were really paranoid you could walk around blasting exceptionally loud noise around and above this frequency for privacy ;). Although your local dogs might not be too happy.
mike_hock 1 day ago 11 replies      
> If you're going to conduct criminal activity, do it in the privacy of your own home.

> That was the original intention of the 4th amendment.


The intention of the 4th amendment was to enable criminal activity, but confine it to private property?

Not to protect citizens' privacy from government snooping without probable cause?

The intent was to prevent the executive branch from indiscriminately "searching" (and hidden microphones are a form of search) innocent people without any cross-check from the judicial branch. The fact that you can "conduct criminal activity in the privacy of your own home" is a side effect, not a design goal, of the principle that a free society is worth the risk of letting a few criminals slip through the cracks.

fosco 10 hours ago 0 replies      
So what is going to be done about it? With each revelation that sparks no riot, it looks more and more like the average person is silenced into submission to just "deal with it" rather than correct it.
bogomipz 8 hours ago 0 replies      
You don't even need to be clandestine about it, though. Why hasn't this story gotten more traction? Where's the outcry?


bitL 1 day ago 0 replies      
Now I am finally convinced that Internet of Things is going to be the next big thing - Internet of Real-time Microphones...
srtjstjsj 22 hours ago 0 replies      
The quotes from the former FBI agent, showing complete disregard for the 4th Amendment, send a clear message about how deep illegal behavior runs in the FBI.
peter303 10 hours ago 0 replies      
Snowden suggests that the microphones and cameras on your cell, landline, desktop can be remotely activated any time by the government or hackers.
cs2818 13 hours ago 0 replies      
To me this goes back to the issue of having a reasonable expectation of privacy.

As others have pointed out, if I am in a public place and someone snaps a photo of me or a surveillance camera records my actions, I generally can't argue this was an unreasonable invasion.

I'm certain legal scholars are well-versed in the technicalities of expectations of privacy, but I do wonder if we will face new challenges in this area as the number of pervasively connected and recording devices increases.

solutron 22 hours ago 0 replies      
This is a more informative article around the topic IMO:


srtjstjsj 22 hours ago 0 replies      
Previously covered on HN: https://news.ycombinator.com/item?id=10582392

Defense Claims Courthouse Was Illegally Bugged (therecorder.com) 171 points by morninj 179 days ago | 100 comments

FollowSteph3 13 hours ago 0 replies      
If this were true, then how do you explain all the crime that still happens, even in high-profile cases? It would make catching bigger crimes a lot easier to manage. But since we still see them struggling even with high-profile cases, where not having an arrest is embarrassing, it's extremely hard to imagine this being true.
bunkydoo 23 hours ago 4 replies      
If I really gave a damn, had 1000 bucks to blow, and had no life, I could go plant hidden microphones, pinhole video cameras, and more in just about every conceivable public place too. If anyone argued or brought me to court, I could demonstrate how this is absolutely no different than everyone "secondhand recording" me in their little Snap stories and Instagrams, which do go online forever.

My point is, this is some small-potatoes 1980's-style shit that is nothing compared to NSA Echelon (which I find to be actually a fascinating technology, if it weren't so malleable to abuse by human nature). If Echelon were an independently operating AI system with algorithms that identified criminal activity, then used soft paternalism in the form of coercing unaware law enforcement personnel into making "chance" encounters with would-be criminals (a random traffic stop, or passing them by on the street), the psychology of the criminal would be to stop doing whatever they're doing, because they feel more watched while doing said crime - although the law enforcement agent may never be directly aware of the criminal's communications or conduct.

I can deal with surveillance all day, no sweat. Totalitarian censorship is my problem, and that doesn't appear to be happening stateside.
sdrinf 1 day ago 2 replies      
Do we have secondary sources confirming this, specifically, lawyers in the case confirming this being brought up as evidence? If so, where is the public record of that?

What are CBSLocal's incentives? Can they push fearmongering without repercussions?

If this story is not true, where should we put the "flag" threshold? What are operational ways of voting with money against stories like this ever, ever making the HN front page again, and for the "news" industry to structurally reduce unfounded fearmongering?

em3rgent0rdr 23 hours ago 0 replies      
Even if judges prohibit such evidence on the grounds of it being illegally obtained, you will still find that prosecutors will use that evidence to find other evidence and for constructing a case that doesn't use that illegally obtained evidence. https://en.wikipedia.org/wiki/Parallel_construction
dredmorbius 1 day ago 0 replies      
Correlated story: Google's Parsey McParseface.
Gratsby 22 hours ago 1 reply      
Haven't you guys ever seen The Wire? Is this seriously a surprise to anyone?
matthudson 1 day ago 0 replies      
Call to action: Machine learners, you'd better start tuning your sarcasm classifiers.
kabdib 13 hours ago 0 replies      
I'd love to find a sample -- how do they communicate? Wirelessly? Betcha they can be fingerprinted
swayvil 1 day ago 0 replies      
How much do you suppose these microphones are worth?

Also, what's an easy way to locate them?

daveheq 12 hours ago 1 reply      
This clickbait article is going to cause people to think the government was spying on everyone everywhere for no other reason than just to spy on you. Another "distrust the damn government" article that further divides people.
xufi 21 hours ago 2 replies      
As this seems to come up more and more: I can't fathom why most people can't see what's being fed to them by our media sources. http://craigbhulet.com/10%20Mega-Corporations.jpg This chart, albeit a bit big of an image, demonstrates which media entities are owned by or connected to whom. Pretty interesting.
tempodox 1 day ago 0 replies      
> ...if youre going to conduct criminal activity...

The actual scandal here is not the potential danger to criminal enterprises but that we're all being made suspects beforehand 24/7.

nxzero 1 day ago 2 replies      
When will people say enough is enough?
justifier 23 hours ago 2 replies      
Huh? But CA and SF are two-party consent?


enlightenedfool 1 day ago 0 replies      
So, can they legally associate a voice with a person using those random recordings?
joantune 16 hours ago 0 replies      
There might come a time in the future when we think that wearing a tin foil hat is the reasonable thing to do :D
yunesj 1 day ago 0 replies      
KPIX5's Jackie Ward said it's legal, so the defense can withdraw the motion.
pteredactyl 21 hours ago 0 replies      
I really did not need to read this right now...
carapace 8 hours ago 0 replies      
"Total Surveillance is the Perfection of Democracy"

(This is an old blog post I wrote that seems relevant to this. Please forgive me for reposting it whole here but I really REALLY want to hear what HN readers think. (And no one sees it on my blog. heh heh))

For once I disagree with RMS, re: https://www.gnu.org/philosophy/surveillance-vs-democracy.htm...

I believe that it is fundamentally not possible to "roll back" the degree of surveillance in our [global] society in an effective way. Our technology is already converging to a near-total degree of surveillance all on its own. The article itself gives many examples. The end limit will be Vinge's "locator dust" or perhaps something even more ubiquitous and ephemeral. RMS advocates several "band-aid" fixes but seems to miss the logical structure of the paradox of inescapable total surveillance.

Let me attempt to illustrate this paradox. Take this quote from the article:

 "If whistleblowers don't dare reveal crimes and lies, we lose the last shred of effective control over our government and institutions."
(First of all we should reject the underlying premise that "our government and institutions" are only held in check by the fear of the discovery of their "crimes and lies". We can, and should, and must, hold ourselves and our government to a standard of not committing crimes, not telling lies. It is this Procrustean bed of good character that our technology is binding us to, not some dystopian nightmare.)

Certainly the criminally-minded who have inveigled their way into the halls of power should not be permitted to sleep peacefully at night, without concern for discovery. But why assume that ubiquitous surveillance would not touch them? Why would the sensor/processor nets and deep analysis not be useful, and used, for detecting and combating treachery? What "crimes and lies" would be revealed by a whistleblower that would not show up on the intel-feeds?

Or this quote:

 "Everyone must be free to post photos and video recordings occasionally, but the systematic accumulation of such data on the Internet must be limited."
How will this limiting be done? What authority will decide who gets to post what and when? And (like any profanity filter) won't this authority need to see the content to be able to decide whether it gets posted publicly?

In effect, doesn't this idea imply some sort of ubiquitous surveillance system to ensure that people are obeying the rules for preventing a ubiquitous surveillance system?

Let's say we set up rules like the ones RMS is advocating: how do we determine that everyone is following them? After all, there is a very strong incentive to seek a privileged position vis-a-vis these rules. Whoever has the inside edge, whether official spooks, enemy agents, or plain criminals, gains an enormous competitive advantage over everyone else.

Someone is going to have that edge, because it's a technological thing, you can't make it go away simply because you don't like it. If the "good guys" tie their own hands (by handicapping their surveillance networks) then we are just handing control to the people who are willing to do what it takes to take it.

You can't unilaterally declare that we (all humanity) will use the kid-friendly "lite" version of the surveillance network because we cannot be sure that everyone is playing by those rules unless we have a "full" version of the surveillance network to check up on everybody!

We can't (I believe) prevent total surveillance but we can certainly control how the data are used, and we can certainly set up systems that allow the data to be used without being abused. The system must be recursive. Whatever form the system takes, it shall necessarily have to be able to detect and correct its own self-abuses.

Total surveillance is the perfection of democracy, not its antithesis.

The true horror of technological omniscience is that it shall force us for once to live according to our own rules. For the first time in history we shall have to do without hypocrisy and privilege. The new equilibrium will not involve tilting at the windmills of ubiquitous sensors and processing power but rather learning what explicit rules we can actually live by, finding, in effect, the real shape of human society.

dude3 19 hours ago 0 replies      
And what are WE going to do to stop it...?

They aren't going to be listening to the comment sections of websites.

MaSk3d 10 hours ago 0 replies      
Just support each other. ;)
Zigurd 1 day ago 1 reply      
He is supposedly the manager of Autodesk's "Global threat management program." Do their "global threats" amount to more than watching over the cars in the parking lot at night?
neat159 14 hours ago 0 replies      
can't believe it
peterwwillis 1 day ago 0 replies      
This sort of public spying has been documented since at least 1959.

This _particular_ public spying has been reported since at least 2012.


"In July 1956, the Pennsylvania Bar Association Endowment (PBAE) commissioned a comprehensive study of "wiretapping practices, laws, devices, and techniques" in the United States. [..] The man appointed to direct the study was Samuel Dash, a prominent Philadelphia prosecutor [..] The result of Dash's efforts was The Eavesdroppers, a 483-page report [..] The book uncovered a wide range of privacy infringements on the part of state authorities and private citizens [..]

While law enforcement agencies were tapping lines in flagrant violation of state and federal statutes, phone companies were deliberately underreporting wiretap statistics to maintain public confidence in their services. While American businesses were stockpiling equipment to spy on employees and gather competitive intelligence, private investigators were using frightening new tools to listen in on wayward lovers and loose-lipped politicians."

http://www.infowars.com/spy-grid-can-now-record-your-convers... (2013)

"The Washington Post recently published a feature length article on gunshot detectors, known as ShotSpotter, which detailed how in Washington DC there are now, at least 300 acoustic sensors across 20 square miles of the city, microphones wrapped in a weather-proof shell that can detect the location of a sound down to a few yards and analyze the audio using a computer program."


This quote from the 1959 report sums up everything nicely:

"As Dash told members of Congress on the eve of the book's release, the American public's longstanding disregard for threats to communications privacy had only served to exacerbate these developments. [..]

'Each generation seems to forget the problems of the past and considers this their own unique problem.' - Samuel Dash"

awqrre 1 day ago 1 reply      
godgod, your comments are being hidden and I am not able to upvote or reply to them ...
jamesdwilson 1 day ago 0 replies      
> Whoever made this site should be dumped off the bridge in the background..

What exactly did you mean by that? The CBSLOCAL.com editor should be murdered? Why?

dang 9 hours ago 0 replies      
That crosses the line into personal attack, which is not allowed here regardless of how wrong someone's views may be.

We detached this comment from https://news.ycombinator.com/item?id=11699424 and marked it off-topic.

ccarter84 1 day ago 0 replies      
My phone keeps getting unavoidable pop-ups, I'm sure it's a great story though.
jondubois 1 day ago 3 replies      
Ok, so these facts are true:

#1. The government is monitoring us.

#2. Facebook, Google, Amazon, Apple and many other huge corporations are also monitoring us... And they are sharing our personal data with each other.

Why are so many news articles about issue #1 and so few about issue #2? The government is a negligible threat compared to corporations. It seems like tech companies are manipulating the media to turn people against the government and divert attention away from themselves.

Tech layoffs more than double in Bay Area mercurynews.com
488 points by akg_67  3 days ago   340 comments top 35
ChuckMcM 2 days ago 10 replies      
That layoffs have increased should not surprise anyone at this point. Unlike in the dot-com explosion, it seems painfully difficult for a regular employee to have divested any of their equity gains mid-bubble. That makes it more painful.

However, every time there is a great deflating, it is because the market is tired and preparing to embrace something different. So far I've been through several of these: chips in the '80s, dot-coms in the late '90s, storage in the early 2000s, and now either web 2.0 or social (depending on how you score it). Three threads are competing for the next round: IoT, machine learning, and bioinformatics. CRISPR-derived technologies could be in there too, but I see a lot of regulatory hurdles headed that way which will be hard to dodge.

The thing to keep in mind, though, if you are one of the folks getting laid off: it isn't you, it's them. Seriously. Layoffs happen when the world shifts and your company didn't shift with it. That said, it still sucks to suddenly be out of a job, and the temptation will be to grab at the first thing that offers you something. My advice to you is to be thoughtful. Think through what you want to do, what the world is going through, and how you want to look back at your role in that. Then head for the area that best meets those needs.

tyre 3 days ago 12 replies      
This is a necessary, if painful, lesson.

I remember the layoffs right before leaving LivingSocial in 2012. A raw, human catastrophe. Watching friends pack up their things, who had been sold on the culture and vision of what the company could be, was an eye opening experience.

This is why you focus on business fundamentals. Not because it is sexy, not because you can tell ProductHunt about your run rate, and not because you want to impress investors.

Because if you fuck up, everyone who depends on you gets fucked.

A few words of advice:

- Before you raise money, ask yourself if that valuation is realistic or aspirational. If it is aspirational, know that you have to reach that valuation without raising another cent, just to break even. Ask yourself if you are default dead or alive.

- Before you start a company, ask if it is solving a problem for customers or your desire "to be an entrepreneur."

- Before you hire anyone, ask if their position is a need-to-have or a nice-to-have. Like customers, few businesses can afford "nice to haves".

- There is always capital available for businesses with strong fundamentals who are solving real needs.

Layoffs are like throwing up from drinking. Not fun, and nearly always your (the founders'/managers') fault.

- Pace yourself.

- Build a company you're proud of.

- Build something that people need.

- Growth is great for slide decks, but employees can't use it to pay rent. Don't chase hypergrowth as an end in itself.

Markets can stay irrational far longer than you can stay solvent, but they can also snap back to reality before you can wake up.

lanestp 2 days ago 2 replies      
Since I'm old and I went through the .com crash (I didn't get to participate in the boom) I hope my input is valid. This feels like more of a correction than a disaster.

The job market has been super easy and mediocre workers have been commanding excessive salaries because of various socio-economic forces (high valuations and cost of living). At some point there had to be a cull of bad startups and workers. It's unfortunate for the individuals and companies involved but I think in a couple years we will be through this and it won't have been so bad.

thedevil 3 days ago 1 reply      
If I'm understanding the article, there was a net increase of 2400 jobs in the quarter. That's still pretty good. (Edit: it may seem gloomy by comparing to 2015, which seemed like an extremely good year). I googled a little because I was curious and found this:


San Jose #1, that looks good. Although San Francisco....

justinsingh 3 days ago 2 replies      
My dad got laid off from the Fremont Western Digital location just the other day. He saw it coming, as they were warned layoffs would keep coming (WD has been laying off workers since the middle of last year, I think). So luckily we moved to a nearby city with substantially lower housing prices than in San Jose, where we were spending lots more than we could justify on housing. If we had stayed in San Jose, we would be in some real big financial trouble - pretty much game over. But where we live now (which is actually a house of the same size, just significantly cheaper) we are living more comfortably and happier, without the stress of living in such an expensive city.

Anyone living in an expensive city with a layoff seeming imminent should really be thinking on the long term about where it's best for their family to live. Going day to day wondering if you'll be able to keep your house is not a healthy way to live.

dmode 3 days ago 1 reply      
This article is very confusing to me. If we are concluding that startups are laying off due to a funding squeeze, how is it that San Francisco actually saw a decrease in layoffs? SF is where most of the startups are based. Also, it says that layoffs totaled roughly 3,000 in 4 months, which comes down to 750 a month. Compare this to what the article said later, that 32,000 jobs were lost per month during the recession. Finally, I couldn't figure out how many jobs were added in this period, to get a context for the layoff percentage. Seems a little alarmist.
YZF 3 days ago 0 replies      
"Today the Bay Area's total employment of 3,353,600 as of the end of March still reflects job growth, with 102,600 workers added from March 2015 through March 2016."

The sky is not quite falling yet. Many parts of the economy are stalling, and the impact of that is felt in companies that sell products into those markets. How things will develop is anyone's guess. I feel "Tech" will keep growing, though some areas that perhaps were historically considered "Tech" might shrink. If you look at all the great things we can do with technology, I think a lot of stuff is still ahead of us. It's just the natural way of things that they don't always go smoothly.

danso 3 days ago 0 replies      
FYI if you're interested in the raw data from which this story is ostensibly based in part, I've dumped it into this Github repo:


sickbeard 2 days ago 1 reply      
The mobile wave is ending and we are in the middle of the dip preceding the next big "thing". A lot of companies are betting on IoT, but these are mostly service companies pushing their own proprietary stacks. That is why I think IoT is a dead end: there is either no demand for the hardware, or the hardware is at a price point that makes no financial sense.
20years 3 days ago 13 replies      
Can any of you from the Bay Area comment on if you have found it harder to find jobs lately? Are recruiters reaching out less than they did last year?

I would be interested in hearing from some of you that have been laid off. Have you found another job in the Bay Area or did you have to leave?

dexterdog 3 days ago 2 replies      
Crikey - Ghostery blocked 21 instances of a detected video player network on that page. What an overloaded mess.
green_lunch 2 days ago 0 replies      
If your company still isn't profitable and needs that next round of VC to survive, be very worried.

I've seen the signs and gone through multiple layoffs over the years.

icdxpresso 3 days ago 5 replies      
How worried should an entry level developer who has been working only 6 months at his current company be? Due to an extremely high cost of living, paying all my bills and student loans and such, I have very little money actually saved up. If I lost my job I'd have to pack my bags and move out of Silicon Valley ASAP. And I really haven't accomplished that much at work during the past 6 months because I joined during holiday season and pretty much spent like 3 months training. I'd be really fucked if I got laid off now.
m1117 3 days ago 1 reply      
We're actively hiring. If you were one of the people let go, please reach out to https://sendsonar.com
client4 2 days ago 1 reply      
While it's sad to see people being laid off, on the bonus side there are jobs elsewhere in the country that could use great engineers. I know in Helena Montana there are a number of companies looking for engineers. It's a great fit for those who enjoy great beer and the outdoors.
lordxenu 2 days ago 0 replies      
A lot of these comments have stated the surface level of what is happening. That is, these companies have failed based on bad business tactics/marketshare, over-valuation, etc... And that the employees being laid off are just the tech-bubble fallout, but will be okay 'because jobs are aplenty' or that they'll just have to reapply themselves in some other job sector.

However, what we don't see is the terms of the layoff that the employees are forced into. Generally, employees don't have a say in what they get in their layoff package, and many times it is not a fair amount in comparison to the work they do and in comparison to what management is getting. In addition, employees generally have to sign a legal document that prevents them from defaming the work practices/management/etc of the company. Either they sign and get their package, or they get nothing at all. I'm basing this on accounts from personal friends who have worked in startups, and although I'm speaking from an anecdotal point of view, I don't see what other data points we can draw from given that no one is allowed to speak out legally.

The bottom line is, we will always be screwed over since we lack basic worker protections. We can talk about bad management and how that can be solved, but ultimately I think each and every one of us need to collectively force a set of protections for workers. Either through law or by forming a strong collective. Until then, this cycle will continue.

motolouda 2 days ago 0 replies      
It's mostly Yahoo at this point, which is no surprise. We've (I've) also seen bubbles burst a couple times in this industry, and it'll happen again. I wonder how long it will take before we figure out not to throw money at bad ideas, overhire, and value companies at much more than they're actually worth. Eh, probably never. Oh well, viva la recession!
grimmdude 2 days ago 1 reply      
What's unclear to me is if these are all 'tech positions' or just positions at tech companies.
chrisper 2 days ago 4 replies      
So, I am graduating next year and I have no internship experience (wasn't able to get one for this summer for whatever reasons). How doomed am I? Hopefully, by the time I am done, there is still at least one job left :(

Although, I guess finding a job might be easier since I am willing to relocate etc.

imron 3 days ago 1 reply      
Those comments.... wow.
tmuir 2 days ago 0 replies      
How many of the Santa Clara county layoffs were from Intel?
CryoLogic 3 days ago 8 replies      
In Seattle almost no companies have been hiring entry level devs for several months.

Fortune 500 not excluded.

Could it be lay-offs from CA migrating down to WA?

NirDremer 2 days ago 0 replies      
"Ah, I see one can use the 'big multiple of a tiny number' trick for bad news too." - Marc Andreessen


stephenr 2 days ago 0 replies      
I've only read 2 comments so far, and already my brain is melting from the ridiculous buzzwords.
vasu_man 2 days ago 0 replies      
Keep Calm and Be in Infrastructure
JoshTriplett 3 days ago 0 replies      
For anyone completely unable to view the article because it hangs trying to load: if you have HTTPS Everywhere installed, try disabling the "MNGInteractive" rule.
sjg007 2 days ago 0 replies      
There's layoffs and there's going out of business.
repler 2 days ago 0 replies      
The easiest way to increase profits is to cut some staff. Payroll is a huge expense for just about every company out there.
lsiebert 3 days ago 0 replies      
It might be interesting to consider what layoffs cost companies in terms of confidence and morale.
eruditely 2 days ago 0 replies      
This sort of stuff is going to happen one way or the other, no industry is immune from this.
steven2012 2 days ago 1 reply      
Most of the layoffs came from older tech companies like Yahoo, Intel, etc., which had slow growth anyway, so they probably aren't responsible for the vast increase in jobs over the last 5 years.

I'm curious what the layoff situation is like for start-ups though since they don't need to report for the WARN act.

known 2 days ago 0 replies      
tn13 2 days ago 0 replies      
Let us not panic here. Unlike 10 years ago, most tech companies are taking more risk, investing more capital, and building more products. Obviously they are going to hire more and fire more as some of the projects succeed and some fail.
BadassFractal 2 days ago 0 replies      
I wouldn't worry about it too much, we have enough unquenchable thirst for filling positions here in the Bay that all of these folks will be quickly and gainfully employed again. One of our local strengths is that the cycle of life here is alive and well. We'll recycle like we always have, nothing goes to waste.
bunkydoo 2 days ago 1 reply      
I don't understand why people feel the need to saturate the Bay Area to compete for generally sub-par living conditions and only slightly above average pay. Every time I ask someone why they are living and working there, they answer something along the lines of "to get rich and move out of here". This type of mentality is why the Bay is the suicide capital of the world, unfortunately.
Germany plans to remove owner liability for piracy on open Wi-Fi hotspots: report arstechnica.co.uk
361 points by Tomte  1 day ago   74 comments top 11
denzil_correa 1 day ago 0 replies      
The article doesn't mention that the case was fought in court by German Pirate Party activist Tobias McFadden.


In addition, there is Freifunk, a non-commercial initiative to provide free public WiFi.


germanier 1 day ago 1 reply      
The wording of the proposed law is not published yet and there are good reasons (including statements by the involved ministries) to believe that it does not really have the effect they claim.
Aissen 1 day ago 1 reply      
The situation is similar in France, with HADOPI (the three-strikes law) adding a "neglect" infraction for failing to secure your Internet access. So you're not directly responsible for piracy, and won't pay fines for it, only for negligence in securing your wireless network (i.e. running an open network). It's pretty comical and hard to apply, and only one person has been sentenced in the 7 years of this law.

Luckily, there's always onionpi (https://learn.adafruit.com/onion-pi/ ) should you need to run an open wifi network.

Pica_soO 5 hours ago 0 replies      
To be perfectly honest, everyone ignored it. That seems to be the role of the lawmakers lately: make some law to comfort the elderly and ignorant, then don't enforce it, and clean up the mess and damage created by the chilling effect. Ironically it won't even save those ISPs it was made to protect, which paid ridiculous amounts of money for smartphone frequencies.
awqrre 1 day ago 2 replies      
It's just common sense... Can an ISP be liable for what its users do? An open Wi-Fi hotspot is basically an ISP....
cm3 1 day ago 0 replies      
It's safe to speculate that the final wording will have limitations and still require identification of users to pursue "pirates", online bullies, online vandals, etc.
ulfw 1 day ago 1 reply      
Years too late. Many years really.
blubb-fish 1 day ago 1 reply      
The infamous CP card is usually played by backwards-minded conservatives, but it was, is, and remains a potentially big problem when you open your WiFi to the public. So are other criminal online activities.
abhi3 1 day ago 4 replies      
Is this a common practice in Europe/US? It seems ridiculous that the WiFi provider would be held liable for such a thing. Why stop there? Make the ISP liable too!
codecamper 1 day ago 2 replies      
I think there should be a law that would disallow passwords for public wifi.

Passwords are soooo annoying (is that a 1, l, or I?), and if everyone had open wifi, it could be utilized much more, lowering everyone's wireless carrier usage. Also, you wouldn't need to worry about someone using too much bandwidth - it would happen, but I doubt more frequently than with the passwords. Well, wifi could be throttled if it were really a problem.

george20 1 day ago 1 reply      
You guys surprise me; how is this news? Western societies are supposed to be liberal, so how exactly would one be responsible for what other people do? How about open WiFi in hotels or restaurants?

I am not surprised, given what I hear you say, that Bulgaria (where I live) has the best internet connection in the world. If you think I am joking, Google it. What the title says is sheer stupidity and makes absolutely no sense - an insult to intelligence.

A cool way to use natural language in JavaScript github.com
481 points by oori  1 day ago   182 comments top 17
oori 1 day ago 4 replies      

  nlp.statement('She sells seashells').negate().text()
  // She doesn't sell seashells

  nlp.sentence('I fed the dog').replace('the [Noun]', 'the cat').text()
  // I fed the cat

  nlp.text("Tony Hawk did a kickflip").people();
  // [ Person { text: 'Tony Hawk' ..} ]
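For intuition, the negate() transform above can be imagined as a rule-based verb rewrite. This is only a toy sketch in plain JavaScript, not the library's actual implementation; the verbBases lookup table here is a hypothetical stand-in for real part-of-speech tagging:

```javascript
// Toy rule-based negation: find the first third-person-singular verb we
// recognize and rewrite it as "doesn't" + its base form. A real NLP
// library would use part-of-speech tagging instead of this lookup table.
const verbBases = { sells: 'sell', feeds: 'feed', eats: 'eat' };

function negate(sentence) {
  return sentence
    .split(' ')
    .map(word => (word in verbBases ? "doesn't " + verbBases[word] : word))
    .join(' ');
}

console.log(negate('She sells seashells')); // She doesn't sell seashells
```

The hard part the library handles is exactly what this sketch fakes: knowing which word is the verb and what its base form is.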

hegivor 1 day ago 3 replies      
Perhaps I am misunderstanding the example, but isn't the date parsing result from the API documentation incorrect?

  nlp.value("I married April for the 2nd time on June 5th 1998").date()
  // [Date object]

  d.toLocaleString() -> "04/2/1998"
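The confusion hegivor points out is a classic ambiguity: "April" is both a first name and a month, and any naive month-name scan will grab it before "June". A toy sketch in plain JavaScript (illustrative only, not how the library actually parses dates):

```javascript
// Naive date extraction: return the first month name found in the text.
// On the README's example sentence this matches the person "April"
// before the intended month "June", reproducing the confusion.
const MONTHS = ['january', 'february', 'march', 'april', 'may', 'june',
                'july', 'august', 'september', 'october', 'november', 'december'];

function firstMonth(text) {
  const words = text.toLowerCase().match(/[a-z]+/g) || [];
  return words.find(w => MONTHS.includes(w)) || null;
}

console.log(firstMonth('I married April for the 2nd time on June 5th 1998'));
// april
```

Disambiguating this requires context (capitalized "April" directly after a verb like "married" is probably a person), which is precisely what makes date parsing in free text hard.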

guptaneil 1 day ago 3 replies      
Hmm this example is interesting:

 nlp.person("Tony Hawk").pronoun(); // 'he'
I was curious how it handled gender-neutral names, since being able to identify gender from a name would be an awesome UX win. I tried a variety of different names, and for gender-neutral names (e.g. "Alex" or "Taylor") it always picked "he". If it doesn't recognize the name (e.g. "Alexa"), it returns "they". Unfortunately, it only recognizes very standard American names. Anything remotely ethnic (e.g. "Anjali") or slightly uncommon (e.g. "Nate") results in "they".

Not picking on the library, since this would be an impossible task even for a human, but it seems odd to have used pronoun identification as one of the headline examples.

Anyway, awesome library overall. This could pair well with a dedicated date parsing library like Sherlock[1] to create some pretty cool conversational UI elements.

1: https://github.com/neilgupta/sherlock
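The behavior described above (a fixed pronoun for recognized names, "they" for unrecognized ones) is what you would expect from a simple first-name lexicon with a default. A hypothetical sketch, not the library's actual data or code:

```javascript
// Hypothetical first-name lexicon with a "they" fallback. A real library
// ships a much larger list; mapping gender-neutral names to a single
// pronoun produces exactly the bias described in the parent comment.
const pronounLexicon = { tony: 'he', mary: 'she', alex: 'he', taylor: 'he' };

function pronounFor(name) {
  const firstName = name.trim().split(/\s+/)[0].toLowerCase();
  return pronounLexicon[firstName] || 'they'; // unknown names get "they"
}

console.log(pronounFor('Tony Hawk')); // he
console.log(pronounFor('Anjali'));    // they
```

Under this model the fix is a data problem more than a code problem: either a richer lexicon, or returning "they" for any name whose observed usage is mixed.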

rolfvandekrol 1 day ago 7 replies      
Why is NLP these days equivalent to Natural English Language Processing? There are many more languages in the world.
gcr 4 hours ago 0 replies      
This could be great for foreign language learners!

Imagine integrating this into Anki: "Please negate this sentence", "Please turn this sentence into past tense", "What's the direct object", and so on.

Perhaps "this sentence" could refer to some interesting sentence from an article that you read in Pocket last week, for example!

fiatjaf 1 day ago 0 replies      
How would I proceed to turn

"this library is great." into "all other libraries aren't great." ?


BinaryIdiot 1 day ago 0 replies      
This library is fantastic. I've used it in some side projects, and while it's far from perfect it hits that "good enough" space pretty well.
pknerd 19 hours ago 0 replies      
Is there something similar available for PHP || Python?
fizzbatter 1 day ago 0 replies      
Anyone know of any good Rust libraries in the NLP landscape? It seems quite dead in Rust. Only partial libraries, and non-compilable libraries.
iamgopal 1 day ago 1 reply      
What a day it would be when NLP is natively supported in browsers and OSes.
MrBra 19 hours ago 0 replies      
What's so cool about this in a way it hasn't been cool before?
rattray 1 day ago 0 replies      
Can anyone explain how this works?
0b01 1 day ago 0 replies      
Title is misleading clickbait. Should be 'a cool way to process natural language' or 'a cool way to use NLP'.
moioci 1 day ago 0 replies      
Somebody has to point out that the past participle of swim is swum, not swam. (/pedantic-mode
marknadal 1 day ago 0 replies      
Wow, this is an excellent README if I have ever seen one. And the project is most excellent as well. Great work.
MrBra 1 day ago 1 reply      
Wow, and why exactly did this make into HN first page? Because it's Javascript?
58028641 1 day ago 4 replies      
Instead of trying to train computers to parse our languages, why don't we improve our languages so that they are unambiguous, and each word has one meaning and each meaning only has one word? If only there were one universal language that had no exceptions.
Concerns for global spread of Zika mean Rio de Janeiro Olympics must not proceed harvardpublichealthreview.org
431 points by bd  4 days ago   201 comments top 21
ChicagoBoy11 4 days ago 9 replies      
This article is idiotic if the main reason for cancelling the games is actually some "global spread" of Zika.

Rio is not some city in the middle of nowhere that no flights come in and out of, into which a massive international influx of visitors will now suddenly arrive; it is a world-class city, a financial hub of the largest country on the continent.

If we take London 2012 as a benchmark, estimates there are that the Olympics yielded a 13% increase in foreign visitors over the previous summer. And that is London, a place where access from other destinations is cheaper, the infrastructure is better, and which is a lot more appealing to U.S. visitors (due to some of the things cited above). It is completely reasonable to expect that the delta in Rio should be roughly the same, if not smaller.

The fact is that the "impact" on travel due to the Olympics will be negligible in a city like Rio, and the economic impact is CERTAINLY negative, unless you are drinking the organizers' Kool-Aid.

I'm fine with cancelling the games, but claiming that not doing so will lead to a much riskier global health situation than the one we are already living in is hogwash.

ehmuidifici 4 days ago 1 reply      
Brazilian here. I've been to Rio a couple of times (living in Sao Paulo right now, which also has problems with Aedes aegypti):

Come to Brazil if you wish, but take care. The Aedes aegypti mosquito is all around and can spread not only Zika, but also Chikungunya (kindly named "chico cunha") and Dengue.

Make good use of insect repellent, watch your stuff when walking on beaches (don't forget robbers and thieves) and think twice about trusting someone.

As someone said before, every city in the world has its problems and brazilian cities have them also.

Personally, I will not go to the Olympics, just because hotel prices in Rio skyrocketed and are almost impractical.

apalmer 4 days ago 2 replies      
It should be considered. I don't know if it makes sense, but the pros and cons should be weighed, and respected international health organizations should weigh in.

Overall, though, I think that given that the facilities have not been fully constructed, the Brazilian government is on the verge of collapse, and the risk of being a major disease vector is plausible...

They should definitely consider delaying or moving the games.

wyldfire 4 days ago 4 replies      
Ever since the news of Zika causing microcephaly came up, I have been wondering whether it would be a net benefit to humanity/homeostasis if we were to exterminate Aedes aegypti. Can Zika spread among humans already? It would seem like a huge risk to the future of humanity if this infectious disease interferes with reproduction.

What's the worst-case scenario for letting Aedes aegypti live? Is it better than the worst-case scenario for exterminating the species?

bhouston 4 days ago 4 replies      
I was wondering about whether this would happen. I am not enough of an expert to make a call on this either way of course, but if experts decided this was the right call would it even be possible to cancel/reschedule the Olympics? Imagine the financial impact on Rio.
dsfyu404ed 4 days ago 1 reply      
I wonder if the medical community is also working with the state department, airlines and travel agencies to make sure people get a stern warning before they buy plane tickets to Brazil and an even sterner warning when they get back.

Considering the political forces that need to be involved in moving (or not) the games it seems foolish to not have a safety net if "please move the games" doesn't work.

S_A_P 4 days ago 0 replies      
Zika is already in Houston, Dallas and other Texas cities. I don't know if it is tourism that brought it here or not, but it has enough transmission vectors that it's only a matter of time before it's everywhere.
neves 4 days ago 0 replies      
What will prevent an outbreak is the weather in Rio. The Zika mosquito takes 3 to 4 weeks to reproduce (http://www.denguevirusnet.com/life-cycle-of-aedes-aegypti.ht...) and it loves hot and wet weather.

The autumn in Rio is the coldest and driest in a long time (http://g1.globo.com/rio-de-janeiro/noticia/2016/03/outono-no...). When the Olympics arrive, the mosquito will be at its lowest population count.

If you consider that the Olympics tourists are wealthy and will spend most of their time in air-conditioned environments, the risk is really very low.

DannyBee 4 days ago 3 replies      
Serious question: Has anyone seen an article or study about the microcephaly effects on pregnancy where the main population studied was not in brazil?

I have yet to see a study that links it to microcephaly for anyone who has not been in Brazil for a long time. I realize the mosquito isn't common, but Brazil also does things like "secretly spray tons and tons of pesticide on their populace", etc.

Given these are all correlative studies, i'd love to see something from a country where there aren't a ton of possible other variables.

(especially given zika has been around and even common forever and it's only now that this seems to be an issue)

You know, before the tourism economies of all of these other countries are completely destroyed.

transfire 4 days ago 1 reply      
What? Are they planning a culling, on which they will blame Zika and the Olympics? Otherwise, this makes zero sense.

Despite all the hype, Zika is not the end of the world. Most people who get it hardly notice and get over it in rather short order just as one gets over a cold.

davesque 4 days ago 2 replies      
So does "fold" just mean "times"? I always thought it meant 2^x times.
mc32 4 days ago 0 replies      
That's not going to happen -- not when their economy is in a nosedive and they are embroiled in impeachment proceedings.

At most they'll spray and fumigate the skeeters for the few weeks the activities take place and then go back to "normal".

ck2 4 days ago 1 reply      
Could people accidentally bring back live mosquitoes in their luggage and such that are carriers?

If so, they could spread around the world at lightning speed.

I think there is also concern for the athletes in that the water is absolutely filthy there.

askyourmother 4 days ago 4 replies      
Come to Rio - watch the Olympians swim in the very dirty water they promised to clean but didn't! Catch Zika, then fill out the police and insurance forms after you get robbed! Seriously though, Rio is a very, very dangerous place to visit even for Brazilians, let alone tourists, and the Zika and chikungunya virus outbreaks are not helping either.

Watch it on TV, from far away.

tim333 4 days ago 0 replies      
It's kind of a shame that there is probably a solution to Zika sitting there waiting for regulatory approval in Oxitec's GM mosquitoes which were first successfully trialed in 2009. Maybe after another one or two million people have caught Zika they'll get around to using it.
sremani 4 days ago 0 replies      
I am mixed about this whole episode: at one level there is bias against developing countries. On the other end, Brazil did not do any favors for itself, with Zika, political turmoil, etc. But they did pull off a FIFA World Cup, so I am a bit more optimistic.
kordless 4 days ago 0 replies      
Good luck on that. People in dissonance don't want to hear about it.
msimpson 4 days ago 0 replies      
Thanks to this article I am now aware of a man named Dick Pound.
mariusz79 4 days ago 0 replies      
The show must go on.
mhurron 4 days ago 2 replies      
> I live a longtime in Rio and never had Dengue, Zika

Don't be so sure about that. Most people who contract both of those don't know that they have them as the symptoms are so mild they are mistaken for a simple cold.

But no, it's just racists pretending that diseases can spread.

You can relax anyway, there is too much money to be made by those running the games in Rio to have it canceled. No one actually cares if people get sick.

ramon 4 days ago 0 replies      
Just wanted to reply to all and say: If you are afraid of Zika, Dengue or whatever else the mosquitoes transmit, don't get out of your houses! I will go after those bastards!

Best to all. Signed: The mosquito killer! :p

Scala Native github.com
465 points by virtualwhys  4 days ago   259 comments top 15
openasocket 4 days ago 2 replies      
Is this a "true" Scala? As in, if I write a Scala program that only uses the scala stdlib, will it run on both the JVM and Scala native with no modifications to the source? Or, is it more like "a Scala" in the same way you would say "a Lisp"? I think a lot of the design decisions for Scala were made so that Scala would work on the JVM and easily inter-op with Java. It might make more sense to modify the language slightly to better suit the native environment. I'm seeing some hints to that on the page with the "@struct" and "@extern" decorators.
densh 4 days ago 13 replies      
Hi all, I'm the author of the project and would gladly answer any questions.
dsabanin 4 days ago 2 replies      
Really excited about this. This is another reason why Scala is an extremely valuable tool these days. With Scala JVM, Scala.js with React Native and now with Scala Native (LLVM), there'll be literally nothing you can't do well with it.
spriggan3 4 days ago 2 replies      
Just saw this on twitter. Scala, you have my attention. There were a lot of talks about "the tools of yesterday" and "the tools of the future" lately. Scala getting closer to the metal, without the JVM is a significant step toward "the tools of the future".
abc_lisper 4 days ago 2 replies      
Hi, Some questions

- Can this reuse existing Scala code?

- How does it compare against Rust/GO/Swift? Why use this over them?

- How about libraries?

CheKhovHemingw8 15 hours ago 0 replies      
Another thread covers Getting Started in a hands-on way: https://news.ycombinator.com/item?id=11699897
gnufied 4 days ago 3 replies      
So, is this "released" yet? There are no downloadable installers or even instructions to compile.
grandinj 3 days ago 0 replies      
Given that Apache Harmony is no longer being developed, you might want to coordinate with the Avian people, since they also reuse Harmony.
smegel 4 days ago 2 replies      
What about GC?
partycoder 4 days ago 1 reply      
In a way, simple Scala code resembles Swift, as commented by this person: https://leverich.github.io/swiftislikescala/

Now the type system of course is entirely different.

OhHeyItsE 4 days ago 3 replies      
Otherwise known as "Haskell"?
jiang101 4 days ago 0 replies      
Wow, they're even planning to provide a native version of the runtime library? That will be more than nice.
airless_bar 4 days ago 6 replies      
Not the author:

> - Can this reuse existing Scala code?

I think that's the plan. Would be a pretty pointless exercise without that, right? :-)

> - How does it compare against Rust/GO/Swift? Why use this over them?

Rust: Scala and Rust have different niches. Rust is more focused on low runtime overhead, while Scala is more focused on low development overhead. This means Rust can be potentially faster to run, but Scala is faster to develop.

Go: Go is utter shit that only survives due to the devs name-dropping "Google" every 5 minutes.

Swift: Scala is more mature, simpler, better designed. Not a knock against Swift, but there is a difference between a language where changes have been tried experimentally for years before either adopting or removing them, and Swift where things get added at a frightening rate.

> - How about libraries?

Libraries without Java dependencies should work, libraries with Java dependencies depend on whether their Java dependency is provided by Scala-Native (just like on Scala.js).

dang 3 days ago 0 replies      
You can't attack someone like this on HN no matter how bad their contribution to programming language discussion on the internet.

If you think you see abuse here, do please let us know at hn@ycombinator.com. We promise to look into it. But single-purpose accounts, let alone ones targeting individuals, aren't allowed here, so we're banning this one.

We detached this comment from https://news.ycombinator.com/item?id=11681233 and marked it off-topic.

dang 4 days ago 1 reply      
Personal attacks are not allowed on HN. We ban accounts that do this, so please don't do it again. Instead, please (re-)read the site guidelines and make sure your posts are civil and substantive.



We detached this subthread from https://news.ycombinator.com/item?id=11678838 and marked it off-topic.

Prison phones are a predatory monopoly One family fought back and won theverge.com
370 points by some-guy  4 days ago   104 comments top 17
avs733 4 days ago 3 replies      
While I find the conversations in here interesting from an economics perspective, they are not solutions to the real problem. This is not a function of cost between inmate, provider, and family alone. These phone systems have a real and tangible cost on society as a whole. An enormous body of research [1] shows that offenders with better/more familial contact while incarcerated have vastly lower recidivism rates.

This is literally companies causing harm (and not just economic harm) to society's citizens at large. I respect jlafon's point of view but I can't agree. The fact that a system you create is difficult to administer should not mean that the cost of dealing with it should be passed along to your 'customers' (gagging as I use that word). When a group of people chooses to put others in a position of limited power, they have a responsibility to protect them from harm. Treating prisoners as a revenue stream at all is immoral and, I believe, unconstitutional. The argument that they should pay or do anything to contribute to their imprisonment is vapid and ugly. If we aren't willing to shoulder the burden of imprisoning them, then we shouldn't do it. We absolutely should not be charging them or their families usurious amounts of money to satisfy rules and situations we created.

Letting prisoners use the phone is labor intensive? Why? because you created rules and a system where it is. To spin it as more complicated or containing 'reasons' is post hoc justification nonsense and should be treated as such.

[1] Summarized here: https://www.prisonlegalnews.org/news/2014/apr/15/lowering-re...

c3534l 4 days ago 5 replies      
Why have we decided that the way to treat criminals is to systematically destroy their social support system for profit? This sounds like a terrible idea, and a direction that doesn't seem to be improving the American prison system in the least.
ndespres 4 days ago 1 reply      
Thanks for sharing. I'm glad this is staying on our radar lately.

There was another link discussed here recently (https://news.ycombinator.com/item?id=11648361) about how these exorbitant prison phone calls are being replaced with video calls - and ONLY video calls. In the linked article we have this quote: "The alternative to high rates isn't lower rates," the association has suggested - the alternative is that phone calls in jails will be done away with entirely. "Absent these commissions," association president Larry D. Amerson wrote in a comment to the FCC, "counties would need to either increase taxes for the system or jails could potentially cease to provide inmates with this service." So either continue to support this monopoly, or don't speak to or see your brother/cousin/mom in jail at all.

Here in New Jersey where I live, as of yesterday you can no longer visit an inmate in a couple of our prisons, in person. Instead, you can pay Securus for a video connection to the inmate you'd like to speak with. I think if more people who were not directly connected to the System via a friend, family member, or personal experience were aware of what's going on, they would be appalled. Instead we conveniently pretend this stuff isn't happening.

From the linked article: [Securus'] Smith defended his company's profits on many of the same grounds other inmate phone companies do. The contracts, he says, are a source of funds for crucial corrections services like health care. "It's really a public policy issue," Smith says. Securus also provides security services, recording calls sent through its system and intervening to break up any illegal plots that it detects. "We really feel like we perform kind of a noble service for society," he says.

What he's not saying is that local municipalities can also get a kickback from the money paid to contact prisoners. So not only does it fund healthcare within the prison system (which of course are also increasingly privatized, so how much of that money do you think can be claimed as profit by the company running the prison), but to fix potholes etc in the local town.. on paper, at least.

What I wish these stories left me with is what to do next. Who do I call, petition, or vote for to get this changed?

koolba 4 days ago 8 replies      
If you give out monopolies, then this is what happens. Here's a simple idea to fix this: capitalism.

Mandate at least two providers at each prison and let them charge whatever they want. Let them race to the bottom so you get the same cheap voip rates the rest of the country has access to.

Oh and if they collude on pricing, throw the management in the same prison.

I bet they'd also start competing on the features the prison cares about too. Like tracking who's calling who, speech to text transcripts, and service levels.

Problem with this approach is that it doesn't allow for the cronyism that is ripe in this type of industry.

hermannj314 4 days ago 1 reply      
Our local jail charges for personal visitation: you get 2 visits per week free but can pay a "nominal" fee to stream additional conversations over the Web. Let's just say that fee was ridiculous. I can't find the link, but it was $30 for 15 minutes, I think.

As a former foster parent that was just trying to connect with the birth parents while they awaited trial, yeah those prices suck.

But hey, who cares about people accused of crimes, right? That's the American way.

jlafon 4 days ago 7 replies      
First of all, I'm not defending what is obviously predatory. However, there is more involved than what you might think at first glance. Right or wrong, correctional facilities have reasons to discourage phone calls (context: I put myself through college working at a maximum security prison). Calls are supposed to be monitored (usually done manually) to prevent criminal business from being done on prison phones - and there are never enough people to listen to all calls. There are never enough phones either, which frequently causes tension between inmates using phones and those waiting for them. In higher security levels phones are labor intensive. An officer has to escort a (potentially dangerous) person from their cell to the phone, and stand there for the duration of the call. And to the article's point, it's such a problem that prepaid phone cards are a form of currency on the inside.
electic 4 days ago 1 reply      
I think this is a very unique article from a web design perspective. The counter on the left hand side, indicating the time you've been reading, and how much your charges would be if you spent that time on the phone is genius. It really hammers in the point of how unethical this practice is.
rbobby 4 days ago 0 replies      
What a horrible thing to do to innocent American families. Bad enough their loved ones have fucked up mightily enough to be incarcerated but now the state gouges the hell out them just to talk to each other.

This is not how a government should treat its people.

Overtonwindow 4 days ago 0 replies      
It's getting worse. Prisons are now forcing inmates and families to use video visitation, eliminating all in person visits.


Zigurd 4 days ago 1 reply      
Think what a thrashing, including calling out the founders and management, companies like uBeam and Theranos get here. Ghouls like prison telcos are 100X worse. Where do we get people who run these operations? Who are they and what makes them tick?
mason55 4 days ago 0 replies      
I shared an office with a guy who did a couple years for white collar crime. He got out and started a business that placed local voip numbers near prisons then patched the calls through to long distance numbers. He charged way less than the prisons were charging for long distance calls.

He was making a ton of money last I talked to him.

mynameisnoone 4 days ago 0 replies      
Hillary (and other politicians on both sides, from local to federal) get a ton of for-profit prison money. No wonder.

EDIT: VICE did a piece in 2014 on people getting locked up because they could not pay their parole fees. Yes, debtors' prison, where parolees pay (or not) for the privilege of freedom. https://news.vice.com/article/debtors-prisons-are-taking-the...

njloof 4 days ago 0 replies      
Can't we just let them have cell phones and let law enforcement tap them by getting a warrant?
kingmanaz 4 days ago 0 replies      
Seems most of mankind's daily labors are inclined toward predatory monopoly these days; either creating their own or wage-slaving toward preserving another's. One laughs with today's comedians as they parody the manners of yesteryear, those musty, pinkies-out concepts of gentlemen and gentlewomen, that gullible faith in the golden rule, yet one's teeth are soon sent gnashing when those many insurances which buttress men's insolence are found to be effected by the same selfish, hard-hearted men as oneself.

Rather than Thoreau's "quiet desperation", the masses instead seem bent toward lives of "clawing desperation".

venomsnake 3 days ago 0 replies      
Is there a good reason for prisoners not to have 24/7 internet access and phones for free, even if monitored?

Hell - give them free WoW or LoL accounts and they may forget to come out of prison once their term is over.

martin1975 3 days ago 0 replies      
something about "prison" and "winning" used in the same sentence had me immediately peg this article as TLDR...
A former CIA spy has revealed his key role in the arrest of Nelson Mandela thetimes.co.uk
424 points by randomname2  17 hours ago   264 comments top 31
cm3 16 hours ago 10 replies      
The CIA is the reason for most of the unstable regions, and it's hard to understand how, after messing with other places and making things worse for everybody but weapons manufacturers, they're still doing the same thing over and over. Previously it was the fight against Communism, now Islam; I wonder what it will be in 2020. The Middle East, Asia, South America - all are right to be very angry with the CIA, but they turn it into a "Burn America" rhetoric, which doesn't help their argument. The CIA's policy is probably controlled by someone else, and that's where changes need to be made, because I like to think they don't come up with this stuff on their own.
zo1 11 hours ago 0 replies      
It's kind of odd that the DailyMail and RussiaToday (RT) have more "info" than the linked article that's paywalled. Not necessarily a bigger "scoop" of the story, but they provided more info about the parties, backstory, pictures included. Honestly, I tried looking for a decent article after encountering the paywall, but most were just "rehashed" quotes and links to the TheTimes article, with absolutely nothing of substance added.



Anyone have a decent article on this with more info and backstory?

mark_l_watson 13 hours ago 0 replies      
Not too far off topic: the book "The Devil's Chessboard", a history of Allen Dulles and the CIA, is informative. The author was at Harper's Magazine for decades, and this book is full of personal accounts and information gleaned from historical records. The book is an eye-opener. I was particularly shocked by how Dulles suppressed information about the Holocaust as it was happening, preventing obvious steps like bombing the railway lines leading to the death camps. Dulles was so fixated on the communists that to him any actions were justified.

Edit: Dulles was also fixated on protecting the interests of his law firm's Wall Street clients, and he viewed the Nazi apparatus, minus Hitler, as a potential resource, something to be protected and largely left in place after the war.

Tharkun 15 hours ago 4 replies      
You know how we're still hunting down and jailing former Nazis and concentration camp guards? Why aren't we doing the same to these CIA agents and decision makers?
sehugg 13 hours ago 0 replies      
NYT reported this in 1990, we just didn't have confirmation and the name of the agent: http://www.nytimes.com/1990/06/10/world/cia-tie-reported-in-...

And it's still unclear whether anyone else was involved in the decision to tip off the police.

fiatmoney 10 hours ago 0 replies      
Mandela & his organization were responsible for a terrorist (in the classical sense - using unfocused attacks on civilian populations as a bargaining chip and method of instilling fear) insurgency that killed thousands of people in incredibly brutal ways. The ascendancy of that government has resulted in South Africa turning from a reasonably prosperous & stable country into one of the most dangerous countries on earth, especially for the white minority, who is ~ one election away from genocide at any given moment.

Props on Mandela for being gracious in victory & not immediately going Full Zimbabwe, but it is insane to suggest that he was some sort of sainted figure that there was no reason to even fight.

dijit 14 hours ago 6 replies      
Given the serious effort the CIA puts into destabilising regions (let's not forget the large number of sources who said people were being ferried and paid to go to Kiev when the Ukrainian uprising was occurring), it would not surprise me very much if they were also behind the British "exit" from Europe, since that would cause an economic collapse of the EU and would remove a superpower from play.

I feel like a conspiracy theorist when saying it out loud, but given the history...

I mean, it's economic suicide for Britain to leave the EU, yet someone is plastering it all over the media and it's not the politicians. :\

mrslave 14 hours ago 4 replies      
In the interest of some context: Mandela was a communist and a terrorist. While a "political prisoner" he was offered freedom as soon as he would publicly renounce violent protest (i.e. terrorism), and he persistently refused to do so. His second wife, Winnie, enjoyed necklacing opponents. So, nice people all round.
brudgers 10 hours ago 0 replies      
Pica_soO 11 hours ago 0 replies      
Agencies like the CIA are just symptoms, the obvious too-late, too-little effort made after problems seem unsolvable by politicians. And they often are. If you have a population that basically votes with its feet for a war every second generation; if you have a social system that depends on constant corruption to get everyone through the day; if you have resources that allow those in power to get by without any responsibility to their countries; if you have armed groups which offer the second sons of second sons a future in guerrilla warfare - what is a group of men in an office building to do about that? The answer is nothing. They can't turn stone to bread. They can't reason with the unreasonable. They can try to gamble on who gets to sit on the iron throne. The sad fact is, the CIA, on a larger scale, has no power over a population which, going in murderous cycles, stomps its own future into the ground. They make an excellent blame-piñata, though.
gfgjmfgjmgh 16 hours ago 1 reply      
The CIA was also behind the assassination of Homi Bhabha (father of the Indian atomic energy program) and Lal Bahadur Shastri (Prime Minister of India).
thomasahle 15 hours ago 1 reply      
I think it's interesting how the CIA's goal seems to be more or less "maintaining the status quo" in the global power struggle. This is obviously a useful thing to have, but if it gets too strong, how can we ever have the positive revolutions that have given us things like democracy?
vidoc 16 hours ago 1 reply      
Oh my God! This can't be true, can it? I don't want to wake up tomorrow and learn that the Iraq war adventure was not based on wrong CIA intelligence but on manufactured intelligence, and that 100% of the US political establishment knew it (in addition to basically the rest of the world).
bunkydoo 14 hours ago 0 replies      
I wonder what would happen if we just told everyone at the CIA to go on a 1 year vacation. We might have world peace
ccvannorman 11 hours ago 1 reply      
"The agent firmly believed Mandela was in the pocket of Communist Russia and was planning to incite the Indian population in the Natal region, where he was based, to rise up."

Given ubiquitous surveillance and increasing state power, how long before average citizens start getting locked up because a single cowboy at an agency "firmly believes" they are a threat?

raverbashing 16 hours ago 2 replies      
Well, sending him to jail helped make him an important figure. True, there's no way of knowing how it would have gone otherwise.

Not to be demonized but not to be idolized as well

tempodox 16 hours ago 4 replies      
> ...the world's most dangerous communist...

How fads change. Had it happened today, they would have called him a terrorist.

Sadly, it's not really news that the U.S. only pays lip service to democracy. After all, the dictator of our choice can guarantee our economic and military interests much better than any democratic regime.

proksoup 10 hours ago 0 replies      
The new title "an individual admitted something not the organization" is definitely more accurate, but represents an unsettling (to me) blame shift. I suppose the original title incorrectly placed responsibility on the organization and not the individual.
JHof 11 hours ago 0 replies      
Here's a very similar article from 1990 - http://www.nytimes.com/1990/06/10/world/cia-tie-reported-in-...
Lerumo 9 hours ago 0 replies      
I find it funny that they label Mandela a communist while the apartheid government owned all important sectors of the economy, e.g. Telkom (telecommunications), Eskom (electricity), Armscor and Denel (weapons), SABC (radio and TV broadcasting), Spoornet (rails and ports), ACSA (airports), just to list a few. Apartheid was basically socialism for white people and brutal fascism for Africans.
irunbackwards 10 hours ago 0 replies      
Of course we did. We've been behind almost every coup of any truly democratic (we Americans like to call this communism) society in the last century.
nxzero 16 hours ago 0 replies      
ertyui 16 hours ago 1 reply      
Linkbait title. If the testimony is true, then at best it's a tip-off about a then-wanted man's location - nothing to do with the CIA being behind a conviction or anything else.
ebbv 12 hours ago 0 replies      
The title of this link is misleading. A former CIA spy (is that even confirmed?) claimed it. That's incredibly different from the CIA actually admitting it.

Which is not to say I really doubt his claims (assuming it can be verified he was a CIA operative who was in South Africa at that time.)

known 9 hours ago 0 replies      
"If there is a country that has committed unspeakable atrocities in the world, it is the USA. They don't care." --Nelson Mandela http://www.huffingtonpost.co.uk/mehdi-hasan/nelson-mandela-i...
c2the3rd 11 hours ago 3 replies      
I would like to thank all the people who had a comment on this article, but did not post it. Truly, you improve the community with your silence. I'm aware of my own hypocrisy on this point, but logical consistency would prevent this sentiment from ever being expressed.

As it is, perhaps 5% of these comments know anything about what they are talking about. One of the biggest intellectual failings of the sort that frequents this place is mistaking being smart for being informed.

hugh4 16 hours ago 2 replies      
Not really an "admission", he was guilty of the crime for which he was charged.
franky303 16 hours ago 1 reply      
Paywalled. Next.
DonHopkins 11 hours ago 0 replies      

  struct NelsonMandela *nelson_mandela =
      (struct NelsonMandela *)malloc(sizeof(struct NelsonMandela));
  free(nelson_mandela);

equalsnil 15 hours ago 7 replies      
Would South Africa be a better place if it had become a Russian-influenced Communist country?

If Mandela was a secret operative for the communists in South Africa, as the local communist party claimed when he died, and as this former CIA operative has claimed, then he had to have known he was playing a tricky game. Ending apartheid, creating a communist paradise, pick one.

[Edit] I know I'm being downvoted, but I also doubt anyone commenting here lives in a communist country, or has ever spent much time in one.

A professor built a chatbot to be his teaching assistant washingtonpost.com
376 points by dlgeek  2 days ago   114 comments top 23
Bartweiss 2 days ago 3 replies      
This is definitely an impressive result, but I'm pretty sure it was made easier by the opposite pattern.

I've had a couple of TAs who I strongly suspected of being chatbots, even when I was talking to them in person. I'm still not 100% convinced they were human.

forrestbrazeal 2 days ago 2 replies      
I was a student in this class, and had at least one question on the forum answered (correctly) by Jill. I can see this technology being hugely useful for teachers who conduct large lecture classes in any subject on a regular basis.

That said, I'd be even more excited if Jill had the ability to synthesize new answers to questions through some type of case-based reasoning. This would require Jill receiving feedback on "her" answers, which might mean the students would have to know "who" "she" was in advance. (Sorry, got lost in the quotes.) Right now, Jill is essentially an automated FAQ-retrieval bot.

lpage 2 days ago 3 replies      
For anyone interested, the class, which covers interactive intelligence and knowledge based AI, is freely available at https://www.udacity.com/course/knowledge-based-ai-cognitive-... . It's a great starting point if you're looking to automate the sort of human intelligence tasks that don't lend themselves well to the traditional searching/planning/proving/minimax/retrieval route that underpins most AI.

I took it before they rolled out the chat bot TA, unfortunately.

ars 2 days ago 4 replies      
This is the key:

> The system is only allowed to answer questions if it calculates that it is 97 percent or more confident in its answer.

Knowing when you don't know the answer is something that has been lacking from AI I've seen. Of course the problem is that not all AI has the option of deferring to a human.

Anyone know what percent of questions the AI answered, vs redirected?
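
The answer-or-defer behavior this comment highlights can be sketched in a few lines. This is only a toy illustration of the thresholding idea — the 97% figure comes from the article, but the scorer below is an invented stand-in, not Watson's or Jill's actual model:

```python
# Toy sketch of "only answer when confident enough": the bot replies
# only if its confidence clears a threshold, otherwise it defers to a
# human TA. The scorer is a made-up FAQ lookup for illustration.

CONFIDENCE_THRESHOLD = 0.97

def route_question(question, score_answer):
    """score_answer maps a question to (answer, confidence in [0, 1])."""
    answer, confidence = score_answer(question)
    if confidence >= CONFIDENCE_THRESHOLD:
        return ("bot", answer)   # confident enough to reply directly
    return ("human", None)       # leave the question for a human TA

def toy_scorer(question):
    """Hypothetical scorer backed by a tiny FAQ table."""
    faq = {"when is assignment 2 due": ("Friday at 11:59pm", 0.99)}
    return faq.get(question.strip().lower(), ("", 0.0))

print(route_question("When is assignment 2 due", toy_scorer))   # ('bot', 'Friday at 11:59pm')
print(route_question("Why does my code crash?", toy_scorer))    # ('human', None)
```

The deferral branch is what makes the scheme workable in practice: unanswered questions simply stay in the human queue, so a low-recall bot costs nothing.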

collyw 2 days ago 7 replies      
Rather than inventing sophisticated chatbots, couldn't he have just had a well-organised web page? If the questions are "where is assignment two" / "when is the assignment due", the examples given seem like a "high tech" (possibly unreliable) solution to a low-tech problem.
mdorazio 2 days ago 1 reply      
"Goel plans to use Jill again in a class this fall, but will likely change its name so students have the challenge of guessing which teaching assistant isn't human."

This will be a fantastic Turing test, at least as far as basic question understanding and natural response formation.

wyldfire 2 days ago 4 replies      
That's stunning. I studied AI in undergrad and I just didn't give it enough of my focus at the time. Little did I know that in my lifetime I'd see real world evidence of computers passing the Turing Test with flying colors.

Of course, Turing was right. What's outside of the set of computable sets? Not much!

I recently read an article headline that talked about creeps using AI to stalk porn stars. It's interesting to think that as computation power continues to grow, there's a huge potential for evil uses of AI. All of the things we see and hear and consider "public" have little or no nefarious uses to other humans. But what about a computer that has human-like capabilities and inexhaustible computation resources? The sci-fi scenarios of conscious AI making decisions to save humanity from itself are not what I fear. The evil humans sitting at the helm of powerful AI is what I fear.

A lot of society's norms are predicated on humans having private thoughts and telling small and large lies as appropriate. What happens when AI-augmented humans know when you're lying?

The future is so exciting and so terrifying.

pesenti 2 days ago 0 replies      
For those wondering what technology is being used. It is using this Watson product: http://www.ibm.com/smarterplanet/us/en/ibmwatson/engagement_.... We are in the process of revamping it and making it a self-service API that will be released this quarter in http://ibm.com/watsondevelopercloud. Some of the core functionality is already available in http://www.ibm.com/smarterplanet/us/en/ibmwatson/developercl... and http://www.ibm.com/smarterplanet/us/en/ibmwatson/developercl... in case you don't want to wait.
lordnacho 2 days ago 0 replies      
Makes sense as a use case. TAs must hear the same questions over and over again from different students, both practical things like "what homework is there" and reference stuff like "What does the line above the X mean?" (No pun intended!)

I wonder how good it is when people don't know what they're asking. This used to happen all the time when I was in tutorials. "I don't get why there's two transistors in a chain". "What's the significance of the process being adiabatic?"

If you can crack that, education will be changed forever, massively.

ikeboy 2 days ago 3 replies      
>Goel and his teaching assistants receive more than 10,000 questions a semester from students on the course's online forum.

Is this normal?

srtjstjsj 2 days ago 1 reply      
Actual URL path:

 /this-professor-stunned-his-students-when-he-revealed-the-secret-identity-of-his-teaching-assistant/
HN can do better than clickbait sites like the WaPo.

lucb1e 2 days ago 1 reply      
I can't help feeling a big FAQ page with a search function would be a lot more useful. Especially if the entries contain aliases for common search terms (like "length" on the "word count" entry), it should cover everything.

Now this is a forum where others can chip in as well, but the FAQ page could prevent you from having to post in the first place. And if you don't have to post publicly to get an answer from the chat bot (because it's a FAQ page and not a bot) you can do 20 searches before deciding it's not in there.
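The FAQ-with-aliases idea is easy to sketch. A minimal Python illustration (the entries and alias lists here are made up for the example, not from the actual course):

```python
# Minimal FAQ search with alias expansion: each entry lists extra
# search terms ("length" for "word count") so common phrasings
# still find the right answer without posting to the forum.

FAQ = [
    {
        "title": "word count",
        "answer": "Assignments must be 1500-2000 words.",
        "aliases": ["length", "how long", "page count"],
    },
    {
        "title": "homework schedule",
        "answer": "Homework is posted every Monday on the forum.",
        "aliases": ["what homework is there", "due dates"],
    },
]

def search(query):
    """Return answers whose title or aliases overlap the query."""
    q = query.lower()
    hits = []
    for entry in FAQ:
        terms = [entry["title"]] + entry["aliases"]
        if any(t in q or q in t for t in terms):
            hits.append(entry["answer"])
    return hits
```

So `search("what is the length limit")` finds the "word count" entry via its "length" alias, which is the whole point of maintaining the alias lists.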

alexcaps 2 days ago 3 replies      
This sounded cool until: "Now Goel is forming a business to bring the chatbot to the wider world of education." Are all educators just trying to create companies these days?
jerryhuang100 2 days ago 0 replies      
in a way i'm not sure why there are still no parents or students complaining that they're kind of being 'defrauded' with a chatbot while paying high tuition (cue the discover card tv ad). one thing for sure is that there are a lot of undergrad and grad students around looking for a part-time job and teaching experience. i know this is a great achievement in AI with huge potential in open courseware, and great for grant applications, but on the other end is someone paying with an expectation of human interaction & care. for years i've seen a lot of complaints about foreign human TAs.
eternalban 2 days ago 0 replies      
"Education is such a huge priority for the entire human race."

Half a century into this current iteration of the game called 'Life as a Human on planet Earth' I have come to the conclusion that our primary focus must be Education and Mental Health. Take care of these and we may yet see our full positive potential.

mathheaven 1 day ago 0 replies      
The next step is to build a chatbot to replace the professor at teaching. If you're not going to enjoy the fruits, why supply the knowledge base for a chatbot that is going to replace you?
zeeZ 1 day ago 0 replies      
The site comes with an obnoxious "enter your email" overlay that won't go away and messes up scrolling when you block it. They do accept @washpost.com addresses though, so...
xufi 2 days ago 0 replies      
Pretty amazing, considering that I've only really taken one online class so far, excluding tutorials I've done. I wonder how this might one day be used for real responses from a real professor.
thaw13579 2 days ago 0 replies      
I wonder how many of the responses were "check the syllabus"
deepGem 1 day ago 0 replies      
I wonder what the 'layers of decision making software' on top of Watson mean. Are these rule engines?
sandaru1 2 days ago 1 reply      
Anyone know whether they used watson APIs to create this bot or used the original watson codebase (with IBM collaboration)?
cdnsteve 2 days ago 2 replies      
Anyone have technical details about this story? What kind of bot used, languages, algorithms, etc?
exodust 2 days ago 2 replies      
Already posted in original form: https://news.ycombinator.com/item?id=11688061

I don't know why SMH would take a Washington Post story and change the title to something stupid, while keeping everything else the same, but that's what they've done.

Wait.. there's one more difference. A pathetic picture of "an AI".

"an AI"? Seriously SMH, just use the original title. Your editorial blundering is embarrassing, and could easily be replaced with "an AI" tasked with re-wording headlines.

That'll do, pig, that'll do medium.com
402 points by exolymph  2 days ago   200 comments top 39
Animats 2 days ago 9 replies      
This isn't about an "app". This is about a business that initiates transfers of money from one account to another. The people behind this didn't realize what business they were in.

If you want to initiate transactions in the financial system, you have to work through what happens when things go wrong, and be prepared, financially and operationally, to handle them. How does the money get from the parent's account to the kid's account? Note that if the kid can get cash out, you can't fund this with a credit card. Visa and MasterCard don't let you buy money with a credit card; that's too fraud-prone.

So, as someone pointed out, you're back to ACH debits. Getting set up for those is hard, for good reasons. It gives you a connection to other people's bank accounts. So you need financial strength, bonding, and good references.

Note that the chore list has to be secure, too. Otherwise the kid can add "Mow lawn, $1000", check off the item, and drain the parent's account.
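A minimal sketch of the server-side rule this implies (hypothetical names and cap, not the actual PiggyBank design): chore creation and payout approval are both parent-only actions, with an amount cap as a second line of defense.

```python
# Hedged sketch: chore creation and payout approval are parent-only,
# server-side checks, so a kid who controls their own client still
# can't add "Mow lawn, $1000" and drain the parent's account.

MAX_CHORE_CENTS = 2000  # illustrative per-chore cap ($20)

class ChoreList:
    def __init__(self, parent_id):
        self.parent_id = parent_id
        self.chores = {}       # chore_id -> (description, amount_cents)
        self.approved = set()  # chores the parent has signed off on

    def add_chore(self, actor_id, chore_id, description, amount_cents):
        if actor_id != self.parent_id:
            raise PermissionError("only the parent may add chores")
        if not 0 < amount_cents <= MAX_CHORE_CENTS:
            raise ValueError("amount outside allowed range")
        self.chores[chore_id] = (description, amount_cents)

    def approve_payout(self, actor_id, chore_id):
        # The kid can mark a chore done client-side, but money moves
        # only after this second, parent-side approval.
        if actor_id != self.parent_id:
            raise PermissionError("only the parent may approve payouts")
        self.approved.add(chore_id)
        return self.chores[chore_id][1]  # cents to transfer
```

The key design point is that nothing the kid's client sends can move money by itself; every transfer requires an action authenticated as the parent.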

If you try to do this by having the parent transfer money to you in advance, and you then release it to the kid later, you're now a depository institution. In most states, you have to be a bank or a money transmitter to do that. (It's so tempting to take the money and run, and that's happened enough times that such businesses are regulated.) Also, that's a pain for the parent. They might as well use Venmo.

PayPal was successful partly because, back when they were above the bike shop on University Avenue in Palo Alto, they started as a security token maker. So they had security people and were familiar with the problems. A big portion of PayPal's operating costs come from dealing with fraud. The legit transactions are fully automated; the problems require a big call center.

qq66 2 days ago 10 replies      
I feel that there are structural problems with this business concept that wouldn't be solved by a bank partner / additional capital / etc. Note that I am not a parent.

The fact is that kids should not be paid for chores on a chore-by-chore basis. This is perhaps the way that you should do things if you just need labor from your children, and want someone who will rake the leaves for cheaper than a hired gardener. But most parents also want to teach their children good values, and one of the most important social things to learn is about reciprocity in relationships. In transactional relationships, where you will perhaps never see the person again, reciprocity must be established immediately (paying for a cab ride, tipping a pizza delivery person). But in social relationships, immediate 1-for-1 reciprocity actually damages the relationship. If you were invited to a dinner party, would you try to pay the host for the cost of your food? Of course not. If it's casual friends, you might bring a bottle of wine, and if it's very close friends, you may not even do that, as you know you will be inviting them over at some point. Paying a child per chore teaches them to treat their family obligations as transactional obligations, rather than teaching them about the nature of reciprocity in a close personal relationship.

Also, when teaching kids about money, I think that many parents prefer to teach kids about "the value of a dollar" with cash, rather than with a bank balance. My parents opened a bank account for me in 4th grade, which was very unusual at the time, to encourage me to save my money, but even then I found that spending money was and is a lot easier when it's a number on a screen than when it's paper money that you take out of your wallet. By first finding pennies and nickels on the street, then getting $5 bills from their parents, and then graduating into a bank account, people learn to appreciate just how many found nickels it takes to buy a flatscreen TV.

xigency 2 days ago 6 replies      
I'm not entirely sure why there needs to be a financial services or banking contract to build this app and startup. What's wrong with PayPal? Or bank accounts? Or Stripe/Square/Braintree? Paper checks?

This might be a little harsh for what seems like a sincerely useful app and startup, but the problem with the "Yo" startups is that it's just raking up leaves (money) to compost them or light them on fire. If the value is in the app, sell the app. If the value is in other people's products and services, then there's no value in it. And if the value is in the "idea" then it's hopeless. If the value is in the fundraising, then we're lost.

What I don't see is why the idea, PiggyBank, couldn't be functional in the first week. Just like I don't see why a company needs $X billion with no imaginable costs or expenses to put that money toward.

Let's say it isn't possible to do mobile-to-mobile payments (which I think it is), or online payments, without owning a bank. Then why not release the same app, and when you do the chores for your parents or your uncle, they put money in a jar? Beyond not making revenue, having users is at least some sort of reward. When you have the revolutionary breakthrough, add that to the app.

And finally, don't leave your job over something you think is a good idea. Make it a reality, and if it takes all of your time and it can support you, then make the jump.

Obviously these aren't things any founder wants to hear, but someone with a greater risk aversion might be able to be as productive with a more realistic focus than the average SV press release.

Edit: Obviously this is a little naïve considering all things, but I'd still like to see the payments possible on a "small business" scale of things.

pookeh 2 days ago 1 reply      
Some people here are commenting on how bad the idea is. There are worse ideas out there that did and continue to do so much better.

The problem here I think was defining the MVP with a timeline. They could have just started with some virtual currency (like a point system) ... kids don't care about money or points as long as they accumulate something. This virtual currency or point system could have been used by parents to say to their kids "if you get 20 points I'll hand you a $20 bill".

You can make an app like that pretty fast, independent of any "partner ecosystem" and then if the idea lifts off then invite the big players into your ecosystem.

Everything they did was opposite.

themartorana 2 days ago 0 replies      
My comment is late, but the thing I take away from this (and I'm biased) is that the VC-first track is as terrible as it's been made out to be for years. Now they have a real business, helping people for money, and all the crazy shit they had to deal with as a start-up isn't a factor now. They've bootstrapped, successfully, and NOW would be the time to start working on a product.

I get it, opening another consultancy isn't the most glamorous thing, but man it gets you to solvent quickly. Then you build a product in your off time WHILE being your own boss. The flexibility is awesome.

That's what we did. Somewhat accidentally, but it's awesome. Now we are a product company with no investors and 100% equity.

Now that's not to say that someone shouldn't take money - but do it now, after you're a business, after you stop wondering where the next dollar is coming from.

It's not a bad way to go.

Cozumel 2 days ago 2 replies      
So basically this was a 'to-do' app like you can find in any beginner's tutorial (quick Google example: https://www.thepolyglotdeveloper.com/2015/03/create-todo-lis...). The only difference was banking integration, which would have been trivial to add: they could use PayPal or Stripe, take the payment themselves and pay the kids out in Apple credits, etc.

Somehow not only did they convince some poor sap to give them $100,000 for it, but that wasn't enough, they had to keep 'fund raising'. They had no outlay at all, just build the app and deploy.

I'm really struggling to understand the mindset here, not just of the developers who didn't need any investment to write that app, but the people who actually dumped money into it.

giarc 2 days ago 3 replies      
I love these types of posts. Very raw, very honest. Sure, it's a pitch for their new business, but they deserve the free ad since they got me to read to the end.
MOARDONGZPLZ 2 days ago 3 replies      
I may be totally out of touch, but why would someone give someone else $100,000 for an app? And why wouldn't $100k be enough? By the end they needed $700,000 "bare minimum."

I just don't get how this can cost so much.

Making something like this in one's spare time while getting paid with a salaried "real" job seems like it would totally be sufficient to get moving.

But assuming it really takes $700k on hand to get these bank sponsors, why not just change the model slightly. For example, integrate with Square cash or something similar. Square is dead simple to use and the money arrives instantly, not sure if it's integrate-able in this manner though, but the concept stands.

And then maybe when the company is really successful with tens of thousands of subscribers, partner with someone to add direct bank account integration.

Sorry if this sounds callous because I really don't mean it to be, I'm just flabbergasted the amount of money involved for some app someone came up with randomly.

pedalpete 1 day ago 0 replies      
Two comments:

1) If the investor had another idea of how to make the product work, why did that not come up earlier? You should be working "with" your angel investor, not selling your work to him so he can run it.

2) You took the wrong lesson from this. "Don't even think about building your product" is the wrong mentality. Build it quick and get a few people using it. Figure out what you need to do to get people using it. The MVP for PiggyBank possibly didn't need to include payments, or it could have included something else instead.

The founders here didn't find a way to make this product work. It's a great idea (I think) and now the first Angel investor is going to take it and make it work because for some strange reason the founders saw the hard road, the one which could break them and destroy their dreams and they decided to run straight into that wall with their eyes closed instead of finding a different route around.

hristov 2 days ago 1 reply      
They got really screwed by their first banking partner. This is when having a good lawyer would have helped.

The correct course of action would have been to politely tell the bank that they would not free the bank from their duties under the contract but would welcome any new introductions. Under their duty to limit their damages, the startup would do everything possible to find a new bank partner, and if they did the original bank partner would be free from their obligations without paying any damages.

This way you get the bank to help you but still be liable under the contract if things do not work out.

Of course nothing here is legal advice, every situation is different, so do consult a lawyer for your own situation.

paloaltokid 1 day ago 2 replies      
Echoing what many others on this thread have said - FinTech isn't just hard, it's insanely fucking hard. The technology piece is truly the easy part. Build the app, design a nice interface, and so on.

Then you have to deal with the banks, and yes, you absolutely must deal with the banks. There is simply no way around it. Anything having to do with money is heavily regulated and for good reason.

The first rule of creating a product that involves money is that people will look for ways to exploit it (read: fraud). There are many great FinTech startups out there right now wanting to do great things and the simple truth is that it is orders of magnitude harder than just about anything else. Barring, maybe, sending people to Mars or disrupting the car industry. :)

theinternetman 1 day ago 1 reply      
Just seems a bit absurd to me, the amount of effort put into an idea that can be replicated in your own home by putting money in a jar for the kid, and then, when it's full enough to buy Skyrim or whatever, putting the money in your pocket and ordering it off Amazon for them.

You get to engage with your child on a person-to-person basis too, not parent via an app and servers.

Not to mention I really don't understand where all the investor money went with no product to show for it. Seeing as the payment part never got fully fleshed out, we're left with a several-thousand-dollar to-do list...

megablast 2 days ago 1 reply      
It seems like they concentrated on the easy stuff (design, app, marketing), and ignored the hard stuff (banking). You need to do the opposite. Work out what is hard, get that started asap. Visit it every single day. When you are blocked and waiting, then you do the easy stuff.
jhwhite 2 days ago 1 reply      
This sounds like a neat concept but then again is this something that really needs to be automated with an app?

A friend of mine made a kanban board for chores. There's some chores that have to be done within a week, and there's some that are optional. The optional one has money clipped to it and when the card for that chore is done the child can take the money. And the mother is the one that moves the card to done.

So kind of the same thing as this app, but a physical board. And I don't see how moving this to an app really enhances or disrupts the physical board.

Zelmor 1 day ago 0 replies      
No better way to ruin children's motivation to do something for its own sake than paying them for chores. Nothing personal, but I'm glad this project flopped. So many children are better off without external monetary motivators. Those will lead to disaster in their lives and 9-5 jobs they hate.
ultimatejman 1 day ago 0 replies      
No MVP. No talking to users. No strategy. No surprise they failed.

MVP - build an app with parent login, child login and a points system for chores.

Talk to users (parents) by having chat in the app.

Find how to acquire users cheap, retain them, and make them pay. Go for a bank partnership when it is a compelling business proposition. Banks like money, not startups with no profits and no users.

orasis 2 days ago 1 reply      
The lesson here friends is to fail faster. The lack of momentum on the kickstarter should have been a clear red flag.
Johnie 2 days ago 4 replies      
A former Googler just launched www.nickel.co, which is an app for kids to manage their allowance. Their backing bank is Sutton Bank.

My guess is one of the partners that they were referring to is CorePro (http://corepro.io/) which was spun out of SmartyPig (https://www.smartypig.com/). Their backing bank was Lincoln Savings Bank.

One of the consumers of CorePro was Qapital (qapital.com). It seemed like recently, they switched models from Lincoln Savings Bank's individual account to Wells Fargo's custodial account.

In any case, if you're doing anything in consumer FinTech, the technology/product is the easy part. Banking partnership, customer acquisition, and unit economics is the hard part. Focus on solving the hard part first before building the technology.


One thing that they could have done for an MVP is a bring-your-own-bank-account model.

1) Parents would open a bank account for their kids (or the kid may already have their own bank account)

2) Integrate with Plaid Connect to get ACH info.

3) Use Dwolla ACH API (free) to transfer funds between parent's bank account to kid's bank account. (The downside of this is that this is a 2 legged ACH transfer which may take up to a week -- but for this use case, it shouldn't be a problem.)
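The three MVP steps above could be wired together roughly like this. The two helpers are hypothetical stand-ins, not the real Plaid or Dwolla APIs; they only mark where those integrations would plug in.

```python
# Sketch of the bring-your-own-bank MVP flow. Both helper functions
# are hypothetical stand-ins, NOT real Plaid/Dwolla calls.

def get_ach_details(bank_credentials):
    # Stand-in for a Plaid-style link step: exchange a bank login
    # for the routing/account numbers needed for ACH.
    return {"routing": "021000021", "account": bank_credentials["account"]}

def initiate_ach_transfer(source, dest, amount_cents):
    # Stand-in for a Dwolla-style ACH call. A two-legged ACH transfer
    # can take up to a week, so the immediate answer is just "pending".
    return {"status": "pending", "amount_cents": amount_cents,
            "from": source["account"], "to": dest["account"]}

def pay_allowance(parent_creds, kid_creds, amount_cents):
    source = get_ach_details(parent_creds)
    dest = get_ach_details(kid_creds)
    return initiate_ach_transfer(source, dest, amount_cents)
```

The structural point is that the startup never holds the funds, it only initiates transfers between accounts the users already own, which is what keeps the MVP out of depository-institution territory.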

This will get them the same value proposition that they would have had with bank integration. Once they prove out the demand, they can then optimize:

1) [Reduce ACH lag] Use an actual ACH processor that will handle the transfer between accounts. The downside is that this is expensive.

2) [Reduce ACH cost] Use a real bank (Wells Fargo, Chase, Citi) as an ACH processor -- this cuts out the lag in transfers. The catch to this is that to work with these banks, they want to see sufficient activity.

3) [Eliminate need to bring your own bank] Partner with existing bank to provide account opening service. If you can prove sufficient activity and traction, many regional banks are interested in a lower cost customer acquisition channel and innovative products. Alternatively, use custodial accounts and they can earn float on the deposits.

What they were trying to do is jump to #3 before proving out the demand. Secondly, banks are approached all the time by startups to do these types of "partnerships". To banks, unless you can show traction or sufficient volume, it's not worth their time to work with you. The large banks (WF, Chase, etc) don't see much business from this area. The small banks don't have the resources or risk tolerance to do this. So you need to find the sweet spot of banks to establish these relationships. There are a number of regional banks that are willing to work with startups (tip: scout the ToS of these fintech startups to see who they have partnership with)

I'm not sure what their monetization strategy was going to be, but the unit economics here don't really work out for low deposit amounts.

(I've spent a lot of time research this area in FinTech)

rjbwork 2 days ago 1 reply      
One of the things I see a lot of people screw up on is trying to get "traction" and buzz and virality going BEFORE they actually have something that is fully functional in people's hands. Seems to me that if your buzz dies out before the product is actually ready to be used in its basic form... you're screwed.
forgotpwtomain 1 day ago 1 reply      
I really find it hard to empathize with these kind of founder stories -- what I see is a technology that's trying to solve a problem where a problem doesn't really exist and then the frustration of failure. Maybe find a problem to solve that actually matters?
harel 2 days ago 2 replies      
In the UK we have gohenry.co.uk. They partnered with Visa and the kids get a debit card, chip & PIN and all; as a parent I can set how and where they can spend their money, automatic weekly allowances, chores and the payoff for them, etc. It works quite well, and I pay a few quid a month for the privilege.
elif 2 days ago 3 replies      
The way I would have done it:

1) Parents are charged via typical merchant mechanism to fill their account with "credits" worth $0.99 each, costing $1.00, and can optionally allow the app to automatically top-up their account when low.

2) The service sends cash in a registered mail envelope at a frequency configurable by users, with mailing fees paid for by credits.

Then the business can function and grow a user-base, and by holding the balance, the company can build up a lot of cash-on-hand which will enable them move onto the bank integration they originally wanted. Bonus: receiving legitimate mail and getting cash in hand are very exciting experiences for a kid.
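The arithmetic of that scheme is simple to sketch. The $1.00 price and $0.99 value come from the comment; the auto-top-up thresholds are made-up illustrations.

```python
# Rough arithmetic for the credit scheme above. The 1-cent spread per
# credit plus the float on unspent balances is what funds the service.

CREDIT_PRICE_CENTS = 100  # what the parent pays per credit
CREDIT_VALUE_CENTS = 99   # what a credit is worth when spent

class ParentAccount:
    def __init__(self, auto_topup_at=5, topup_amount=20):
        self.credits = 0
        self.auto_topup_at = auto_topup_at
        self.topup_amount = topup_amount
        self.charged_cents = 0  # total billed to the parent's card

    def buy_credits(self, n):
        self.charged_cents += n * CREDIT_PRICE_CENTS
        self.credits += n

    def spend(self, n):
        """Spend credits on a cash mailing (postage comes out of the
        same balance); auto top-up when the balance runs low."""
        if self.credits < n:
            raise ValueError("insufficient credits")
        self.credits -= n
        if self.credits < self.auto_topup_at:
            self.buy_credits(self.topup_amount)
        return n * CREDIT_VALUE_CENTS  # cash value released, in cents
```

With these numbers, a parent who buys 10 credits is charged $10.00, and spending 8 of them releases $7.92 in cash value; the 2-credit remainder then triggers an automatic 20-credit top-up, which is where the float accumulates.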

brett40324 14 hours ago 0 replies      
What parent really wants to complicate the fundamental lessons and experience that chore assignment and reward teach their kids?

Not every thing, process, or task out there is improvable by an app. Maybe all that wasted capital could have gone to kids who won't ever be given an allowance?

markbnj 1 day ago 0 replies      
I did a payment card thing back in the 90's with a couple of friends, during the first Internet wave. We actually got some traction and signed a few bank partners, but ultimately it is very hard to be disruptive when you require access to a massive, privately-owned network that is controlled by the same institutions you might be threatening. All you have to do is set eyes on Mastercard's marble palace on the Hudson to know who is calling the shots in that world.
nhangen 1 day ago 0 replies      
I came into this thread and expected to read encouraging and/or consoling comments, and instead all I see are people bashing the idea and its execution.

Of those bashing this business, how many of you have actually built something of your own, and of that group, how many of you have become successful doing it?

If you look hard enough, you can find flaws with any business concept. It's the entrepreneur's job to stay alive long enough to recognize and fix those flaws. You don't fix them on day 1. Sometimes it takes years, even decades.

How about instead of tearing these guys apart for trying to build a business, we congratulate them for going all-in and paving the way for future entrepreneurs to get this concept right?

swalsh 1 day ago 0 replies      
My backseat quarterbacking here... I feel like these guys focused on the wrong things, the fascinating thing though is that with all the help they had, no one told them.

I would have got a marketing person in on day 1. Who needs a "back-end engineer" if you have no users using the product? I wouldn't have rewritten the app. Who cares how maintainable the app is if you have no users? I would have looked for a more realistic back-end route that could work today while working on the bank integration (why not top up gift cards? I've paid people to write blog articles for me in Amazon gift cards; they're as good as cash). Basically, just keep plowing until you get from zero to one, even if "one" looks NOTHING like your original vision.

mywittyname 2 days ago 3 replies      
I don't understand why they gave up the company instead of pivoting to the cashless approach that the original investor wanted anyway.
bikamonki 1 day ago 0 replies      
Great post-mortem, thanks for sharing.

Did you try a model where your app would be a great vehicle to sell (yet another) credit card to parents plus additional cards to their kids? In that model transfer transactions would be internal to one company and maybe they would already have an API to interact.

In my country, risk investment and VCs are nonexistent, so every time I think about funding one of my ideas, a possible option is to use a big business that serves as the distribution channel while the app serves to drive sales. In telcos, for instance, apps could be a selling point in corporate sales; say, a job tracker that helps a telco nail a contract with a contractor firm.

hoodoof 2 days ago 0 replies      
This story rings a bell on so many levels.

Not because my stories are the same, but because the entrepreneurial journey is so long.

You have to try over and over and over again and keep learning every time and every time it fails for a new reason there's a new lesson.

eljimmy 2 days ago 0 replies      
It sounds like they were near bankruptcy right before they gave up the app.

How do you go from near bankruptcy to starting a consulting firm and generating income in your first month? Would love to hear that part of the story.

ohazi 1 day ago 0 replies      
> We were behind on payroll taxes

Don't ever do this.


rdlecler1 1 day ago 0 replies      
The irony here is that their banking partners may have had confidence in the company if YC had accepted them, bringing in more investors. You just can't lean-startup everything, and yes, money can often buy you what you need.
sakopov 2 days ago 1 reply      
Something I see a lot of in my area. A couple of people, usually friends, pair up for an app or a SaaS idea. It falls through for one reason or another. A little later the same folks end up launching a web dev agency, work on random contracts and make a ton of money. I have no idea why more people don't just pursue that to begin with. I have a friend who makes very good money doing this while keeping his full-time job.
sgtpepper 2 days ago 1 reply      
So... ChoreMonster but with cash?
HillaryBriss 2 days ago 0 replies      
At least the lunch looked pretty good
homero 2 days ago 0 replies      
Reminds me of knox payments which seems abandoned
shitgoose 1 day ago 0 replies      
Liberty Reserve would have solved all your problems, but Budovsky rots in jail. And you guys go through emotional roller-coaster with banksters who got lost in their own regulations.

Another observation - you had a great idea that delivers clear value. Didn't happen. Now, by looking at your web site, you quite successfully sell some bullshit to other bullshitters who sell their shit to VCs, who sell their shit to innocent bystanders. Hell of a transition.

mahyarm 1 day ago 0 replies      
Andddd shit like this is why bitcoin was invented.

Fund via coinbase!

Thanks for the post mortem.

graycat 1 day ago 0 replies      
Sounds like "Don't count chickens before they hatch" or, in this case, don't allocate time or money or sign deals or promise things before you are darned sure you can do them -- better yet, have those things already done.

Or basically have a plan that has the startup go live and get revenue enough for break-even before making promises. Then, when one of the unexpected things happens, the launch date is just pushed out another month or two, but you don't get into trouble with investors, bankers, lawyers, employees, etc.

Show HN: BitKeeper Enterprise-ready version control, now open-source bitkeeper.org
381 points by wscott  5 days ago   303 comments top 34
bcantrill 5 days ago 8 replies      
The grand irony is that Larry was one of the earliest advocates of open sourcing the operating system at Sun[1] -- and believed that by the time Sun finally collectively figured it out and made it happen (in 2005), it was a decade or more too late.[2] So on the one hand, you can view the story of BitKeeper with respect to open source as almost Greek in its tragic scope: every reason that Larry outlined for "sourceware"[3] for Sun applied just as much to BK as it did to SunOS -- with even the same technologist (Torvalds) leading the open source alternative! And you can say to BK and Larry now that it's "too late", just as Larry told Sun in 2005, but I also think this represents a forced dichotomy of "winners" and "losers." To the contrary, I would like to believe that the ongoing innovation in the illumos communities (SmartOS, OmniOS, etc.) proves that it's never too late to open source software -- that open source communities (like cities) can be small yet vibrant, serving a critical role to their constituencies. In an alternate universe, might we be running BK on SunOS instead of git on Linux? Sure -- but being able to run an open source BK on an open source illumos is also pretty great; the future of two innovative systems has been assured, even if it took a little longer than everyone might like.

So congratulations to Larry and crew -- and damn, were you ever right in 1993! ;)

[1] Seriously, read this: http://www.landley.net/history/mirror/unix/srcos.html

[2] The citation here is, in that greatest of all academic euphemisms, "Personal communication."

[3] "Sourceware" because [1] predates the term "open source"

dsr_ 5 days ago 3 replies      
For people who don't know the history -- McVoy offered free bitkeeper licenses to various open source projects, and the Linux kernel switched to it.

After Andrew Tridgell (SAMBA, among other projects) reverse-engineered the bitkeeper protocol [1] in order to create his own client, the license was rescinded for everyone.

As a result, Linus wrote git.

[1] https://lwn.net/Articles/132938/

luckydude 5 days ago 0 replies      
Lots of cross platform goodies in there as well as some interesting data structures. For example, our list data structure is in lines.c, it's extremely small for a small list and scales nicely to 50K items:


to3m 5 days ago 1 reply      
1 year ago: https://news.ycombinator.com/item?id=9330482

What changed? Is BitKeeper still an ongoing business with some other model, or is that, as they say... it? I hope not.

civilian 5 days ago 1 reply      
I have some questions about Why.html: https://www.bitkeeper.org/why.html

> Spending a lot of time dealing with manual and bad auto-merges? BitKeeper merges better than most other tools, and you will quickly develop confidence in the quality of the merges, meaning no more reviewing auto-merged code.

Do you have examples of merge-scenarios that are a Conflict for git but resolve for BK?

> BitKeeper's raw speed for large projects is simply much faster than competing solutions for most common commercial configurations and operations, especially ones that include remote teams, large binary assets, and NFS file systems.

Is there a rule of thumb for what size of repos benefits from BK? (And I suppose size could either be the size of a current commit or the total size of the repo.)

Are there any companies like github or bitbucket that support BitKeeper repos?

stephenr 5 days ago 0 replies      
Amongst all the "too late, I loves me some git" type comments, I figure I'd say thank you and good luck with continued revenue.

I haven't read much about bk so far, so forgive my lazy web question: does/can bk operate over standard ssh as git/hg/svn can, or does it require a dedicated listening server to connect to?

Edit: answering my own question, yes it does support ssh as a transport

kazinator 5 days ago 2 replies      
How does BitKeeper scale to large projects? (Like, say, gigabytes of binaries.) This is a weak area of Git.


From the "Why" page:

BitKeeper's Binary Asset Manager (BAM) preserves resources and keeps access fast by providing local storage as needed.

BAM is great for any organization that handles:

* Videos

* Photos

* Artwork

* Office files

* CAD files

* Any large binary files

teddyh 4 days ago 0 replies      
BitMover still holds all the copyright, and has all the developers. They obviously wanted to keep BitKeeper proprietary, and are only doing this now when facing irrelevance in the marketplace. If BitKeeper becomes popular again, who's to say they won't take development proprietary again? Sure, the community could fork the latest free version, but there isn't a free development community for BitKeeper; they're all internal to BitMover.
educar 5 days ago 1 reply      
I half-expected 'very late' comments before I read the comments. I wasn't disappointed.

For those who commented that way, please reconsider this winner-takes-all outlook on the world. The world is better because of choice, and it's in everybody's best interest to have more distributed version control systems.

adrianN 5 days ago 1 reply      
Why would I want to use this over git or mercurial?
paradite 4 days ago 1 reply      

  $ bk clone bk://bkbits.net/bkdemo/bk_demo
  $ cd bkdemo
  # edit files using your favorite editor
  $ bk -Ux new
  $ bk commit -y"Comments"
  $ bk push
As a user whose first VCS was git, I am quite confused by this "quick demo": I have no idea what "-Ux" means, no idea what "new" means, no idea what "-y" means, and no idea why it is immediately followed by quotation marks instead of being separated by a single space. If bk wants to get new users on board, it needs a better quick demo that makes sense to new users.

qwertyuiop924 5 days ago 1 reply      
Too late to dominate, but maybe not too late to carve out a niche. It seems to have some advantages over the competition and appears to bring something reasonable to the table. Besides, competition is always good.

At the very least, Bryan Cantrill will be happy :-D.

sspiff 5 days ago 1 reply      
I'm wondering: how does it handle large binary files? Any better than git or hg without extensions?
jeremycole 5 days ago 1 reply      
Huh. Thanks for doing this. As a MySQL employee in the early days I used BitKeeper and fell in love with it and kept using it as long as I could. I mainly use Git these days, but frequently miss BitKeeper -- BK felt a lot more natural to me than Git ever has.
devnonymous 5 days ago 1 reply      
Great news! Better late than never! I hope they (or a client of theirs) create a BK-backed service soon. I, for one, think we need more than just GitHub and Atlassian in the market, if only to ensure the businesses don't take their users for granted (hint: SourceForge).
rburhum 5 days ago 1 reply      
This is very cool... but also a bit late. The market has already adopted git and the momentum is there. Unless there is a trivial way to switch back and forth from git, or something that is orders of magnitude better, this is a decade too late.
PuercoPop 5 days ago 2 replies      
Something I'm wondering that the man page doesn't make clear: does it track files across renames, or does it only track content like git?
drewg123 5 days ago 1 reply      
Can it import from git or SVN or mercurial?

Looking at the bk import man page, it looks like it cannot import from any modern VCS. I see only RCS, SCCS, CVS, and MKS as options. This is unfortunate, as I have a mercurial tree I'd like to import.

jordigh 5 days ago 0 replies      
Well, that took a long time... I wonder what changed in the eleven years that Git and Mercurial were deployed to replace bitkeeper.
Annatar 4 days ago 0 replies      
The biggest feature for me is the efficient handling of large binary files, because it means I could finally have a completely self-contained repository (clone and everything is in one place, plus free replication), but without the performance penalties which for example Mercurial incurs with binary files:


I have to try it out just for that!

okket 5 days ago 1 reply      
There is an official mirror on GitHub:


ausjke 4 days ago 0 replies      
This predates git; in fact, if it had been open sourced from the start, git may never have existed. Sigh, how ironic.

If BitKeeper had been open sourced, it could be a powerhouse nowadays, both in open source and commercially. Now it is too late and, honestly, irrelevant.

kingosticks 4 days ago 0 replies      

 "The ability to seamlessly share only a subset of your source tree "
I've spent a good 10 minutes trying to find anything specific in the documentation about this but have come up empty. Is this just by virtue of using submodules, ssh, and filesystem permissions, or is there something more that I'm yet to find? The lack of fine-grained security on modern VCS systems is one of the reasons our monolithic repository is still using CVS.

On a related note, the getting started documentation should be more prominent on the Web page.

prirun 4 days ago 1 reply      
I think the same points made in Larry's 1993 paper could be made about various Linux distributions:

  Why a gazillion package managers?
  Why not a common filesystem layout?
  Why not a standard desktop?
IMO, Linus should enforce his Linux trademark by forcing every distribution to follow a set of standards. If they don't, they can't call it "Linux". If he got them in a room and said "This is the way it's going to be, or else", they'd do it.

foreign-inc 5 days ago 0 replies      
Some history from Linus himself https://www.youtube.com/watch?v=4XpnKHJAok8
loeg 5 days ago 3 replies      
Interesting: FreeBSD 7 and 8 binaries are available for download. Neither of those is a currently supported release. It's like offering RHEL 3 or 4 binaries.
rdtsc 5 days ago 0 replies      
I see this as "features" https://www.bitkeeper.org/why.html

See large repo support, security and others.

Is that geared towards comparing with Git/GitHub? Is there a more focused comparison with those, i.e., both with git itself and with GaaS (Git as a Service)?

paulasmuth 5 days ago 1 reply      
The nested repository feature sounds amazing. Dealing with both git submodules and git subtrees has been a huge pain for me.

I'm looking forward to trying this out over the weekend. Is there some kind of util/script to import history from git?

jwilk 5 days ago 2 replies      
"[...] Linus moved to it and most of the developers followed. They stayed in it for three more years before moving to Git because BitKeeper wasn't open source."

Um, the "because" part is not quite right.

benjarrell 5 days ago 1 reply      
Does this come with any sort of web interface?
gbraad 5 days ago 0 replies      
Great to see this finally happen... However, for 'us' Git remains a keeper.
talles 5 days ago 0 replies      
Too late?
ashitlerferad 5 days ago 1 reply      
Too late :)
ezoe 4 days ago 0 replies      
It's too late. There is no reason to use a non-git DVCS in 2016.
ZFS lands in Debian contrib debian.org
271 points by turrini  1 day ago   177 comments top 13
dakami 1 day ago 8 replies      
So I just started experimenting with ZFS, because it seemed required for container snapshots.

Then I found out it fragments badly, and nobody can figure out how to write a defragmenter. So, uh, keep the FS below 60-80% full apparently.


dmm 1 day ago 2 replies      
It appears that the kernel-level code is shipped as source to be built, automatically by dkms, by the end user. Check out the list of binaries on the bottom left of that page.

This means that no binary kernel modules are shipped, just the cli tools.

espadrine 1 day ago 6 replies      
Have there been updates on the legal situation since http://blog.halon.org.uk/2016/01/on-zfs-in-debian/?

I am obviously glad that this happened, but afraid of an Oraclocalypse.

Nursie 1 day ago 1 reply      
Excellent, have been running with some ubuntu ppa stuff for a while now, and that's great but things occasionally break. Looks like soon I can ditch it for pure debian again.
jordigh 1 day ago 1 reply      
contrib is a funny place for it. Normally contrib means free software that depends on non-free software. In this case, it seems to have acquired the meaning of free software that has a license incompatibility with other free software. I wonder if we have heard the last of CDDL vs GPL.

Technical solutions to legal problems don't work, just like GPL wrappers don't work (at least, that's what some lawyers say). If Oracle decides to make a stink about this, they still can.

edit: Huh, apparently last year Debian actually got advice from SFLC about this:


l1ambda 1 day ago 1 reply      
Great news. So this means Ubuntu, Debian and the new Redox OS now have ZFS. I would love to see it officially supported in Fedora too.
mrmondo 11 hours ago 0 replies      
This is interesting to see at a time when so many key packages are missing or badly outdated in Debian core.
grigio 1 day ago 8 replies      
So is btrfs dead?
Eun 1 day ago 1 reply      
finally, hopefully it makes its way to the installer (unlike the ubuntu installer...)
4ad 1 day ago 3 replies      
Unfortunately, it's a dkms, which means it gets compiled on the user machine on update.

From an operational perspective, this is insane, I need reliability. Of course in my organisation I could create a binary package and use that, but that's more work and then the new Debian package doesn't help me anyway.

When I need linux I just run ZFS on a better supported system and either virtualise Linux or expose an iSCSI target from ZFS for Linux.

gjvc 1 day ago 2 replies      
massive opportunity now for Oracle to generate some goodwill
yc-kraln 1 day ago 1 reply      
so... what's missing?
sobkas 1 day ago 1 reply      
But it didn't land in Debian. It only landed in contrib. Title of this link is wrong, so maybe someone should fix it?


Pennsylvania license plate reader SUV camouflaged as Google Street View vehicle vice.com
323 points by uptown  3 days ago   126 comments top 16
ChrisBland 3 days ago 3 replies      
I think this could be one of those really tasteless jokes made by the people stationed in the truck. I'd imagine it gets pretty boring in the car, and someone thought it would be funny. It probably is funny to them, but taken out of context it looks really bad.
jscheel 3 days ago 0 replies      
It's like a modern take on the "cable company" FBI van. Except now local law enforcement is able to deploy advanced mass data-gathering technology without any proper training or framework for handling said data appropriately.
gruez 3 days ago 3 replies      
Google should sue them for misusing their trademark.
partycoder 3 days ago 3 replies      
Note that streetview cars are not only for maps, they're also wardriving cars. They collect information on Wi-Fi networks, so then you can map a router MAC address to a physical location. They used to sniff wifi traffic too.

There was a tool (http://samy.pl/mapxss/) that allowed you to interface with that system. When I moved to a new home, my router appeared at the old location for a few weeks, and then it got updated to the new one. Creepy.

optimuspaul 3 days ago 1 reply      
Seems like they are suggesting that Google Streetview cars aren't government spy vehicles. I had never come to that conclusion myself.
at-fates-hands 3 days ago 7 replies      
This is actually pretty terrifying when you follow this to its logical conclusion.

You're going after license plates, which are regulated by state governments. It's illegal to obfuscate them, so there's no way to conceal your identity like you can with some of the CCTV stuff. It's essentially an easy, legal way to keep tabs on your population. As the article pointed out, you can also tie this to all kinds of available data, and start creating profiles for people.

This is really scary, scary stuff.

hackuser 3 days ago 2 replies      
The Democratic National Convention will be in Philadelphia in the next few months. Possibly it's related to that.

If tracking citizens is legal and ok, then why do they have to hide it?

abeppu 3 days ago 1 reply      
This crowd is already aware that the public is more willing to entrust broad, deeply personal datasets to profit-seeking corporations than to a government which ostensibly serves them. But the people running the surveillance state seem to always live in a parallel unreality, where their work is presented as unimpeachably noble and necessary. To me, the silver lining in this story is the indication that the people operating the surveillance machine understand that we find their work strictly more creepy than the data collection conducted by non-transparent, unaccountable, explicitly self-serving corporations.
mgrennan 3 days ago 0 replies      
When and how did "Protect and Serve" become "Sneak and Spy"?
unabridged 3 days ago 4 replies      
I find it strange people here are worried about license plate readers (which only record slivers of your location data) while at the same time they carry a phone that transmits realtime location data to many companies (the manufacturer, your provider, google or apple, and plus all the apps you have installed). And if your car is less than 10 years old it probably has a phone installed in it doing the same thing.
rrggrr 3 days ago 0 replies      
If the data they're collecting isn't being used in an active investigation then its probably subject to FOIA requests. So, perhaps the effort to keep it under wraps is to avoid a flood of FOIA requests for the data.
kelvin0 3 days ago 2 replies      
It's bizarre to see how many people are upset when gun laws even hint at some regulation, but being spied on and losing privacy is not even close to being an issue for them. Isn't encroachment on personal lives a threat many times worse?
london888 3 days ago 0 replies      
It's a needless way to alienate State and citizenry.

Just say what it is on the outside.

pitt1980 3 days ago 0 replies      
doesn't seem like a very good disguise

how often do you see a google streetview car?

shouldn't about 1 or 2 passes a year get them the info they need?

if you saw a google streetview car in the same neighborhood, 5 or 6 times in the same week wouldn't it seem highly unusual?


seems like someone got overly clever to me

that the apparent sanctity of Google Streetview Cars was violated, and that people seem upset, seems utterly laughable to me

frgewut 3 days ago 3 replies      
Makes me wonder what will happen when an average phone will be able to do ANPR easily...
sickbeard 3 days ago 4 replies      
what's wrong with government spy truck? Did someone come to the sudden realization that there are sanctioned government agencies whose sole job is to spy?
Apple invests $1B in Chinese ride-hailing service Didi Chuxing reuters.com
299 points by ssclafani  2 days ago   123 comments top 22
devy 2 days ago 5 replies      
There were a few ideas floating around Chinese tech blogs; the most plausible reasons are:

1. Integrate Apple Pay into Didi Chuxing apps, which is estimated to have 300MM users in China[1]. Didi currently only accepts payment with Weixin(aka WeChat) Pay and Ali Pay (aka Ant Financial Services Group, owned by Alibaba Group).

2. Massive data points for developing Apple's self-driving technology. Didi operates in 400 Chinese cities with over 11 million rides per day, and accounts for 80% private car hailing market and 99% taxi hailing market.[2]

3. And yes, investing in the Chinese tech sector gives them better leverage in negotiations with the government, and also, as sbuccini said, association with Didi's other major investors.

[1]: http://www.theverge.com/2016/5/12/11669178/apple-invests-1-b...

[2]: https://en.wikipedia.org/wiki/Didi_Chuxing

sbuccini 2 days ago 8 replies      
Apple has huge stores of cash sitting overseas. They can't bring it home without being subject to large tax penalties. With that cash stockpile growing, it seems like they're having trouble finding ways to put that money to work overseas. I wonder if they would have still made this investment if they had the ability to bring that money back stateside without hefty tax liabilities.

Regardless, $1B is not a small chunk of change, even for Apple. Clearly, natural synergies could arise when Project Titan matures. But Apple tightly coupling itself with a rising player in the Chinese tech sector is a smart play (not to mention associating itself with other notable Didi investors like Alibaba). We've seen similar moves by Uber, which took a large investment from Baidu.

Interesting times ahead.

salimmadjd 2 days ago 2 replies      
Apple Maps/Siri and Google Maps will eventually become the interface for ordering car sharing. Especially once there are autonomous cars and Uber or Lyft no longer own the driver side of the equation, ride sharing will become a bit like ordering rental cars from Kayak.

The same way Facebook is becoming the interface to content, both google and apple will own the consumer side of the cars and help you pick the best deal or cheapest option. It could be from Uber, a guy who owns a fleet of 20 autonomous cars or Hertz, etc.

This move totally makes sense for Apple, as they will move to own the consumer side of this market.

aresant 2 days ago 2 replies      
Are they investing in Didi the ride sharing company or investing in Didi, Uber's arch rival in China?

The chess game afoot in the autonomous vehicle battle is attracting some strange bedfellows: Apple, Google, Tesla, Ford, Mercedes, Uber, Lyft, Volvo, Nvidia, etc.

The end game is outrageously big, generationally big, and it's going to be a treat to watch the Titans lock horns.

zer00eyz 2 days ago 3 replies      
Apple didn't do this because they are building a "self driving car" that they are going to sell.

Apple wants to get iBooks and movies selling in China again; it is vital to Apple's car strategy. This move will give them some leverage with China in getting those markets active again.

If you know anything about Tim Cook, you know that he is a master of the supply chain. I don't think that someone like that is going to jump into Apple building its own car.

So if Apple isn't going to build a car, what ARE they doing with all these people on the payroll who have worked with cars?

It's simple: Apple wants to own the dashboard of the car. We're not talking about "CarPlay", we're talking about the WHOLE dashboard. Once you own the dashboard, you're hooked into location and destination (Apple owns a mapping solution); they can leave the "self driving" component to the vehicle vendors.

Why would any automaker want Apple in the dashboard? Why would Apple want the dashboard? It's simple: entertainment! With the Beats acquisition, Apple owns something that looks like radio, and music has always been there with iTunes. There is no reason you can't rent movies and books into the back seat as well.

Aelinsaar 2 days ago 0 replies      
It must be nice to have so much cash that a $1B stake can be seen as somewhat speculative. Incredible really.
electriclove 2 days ago 2 replies      
This is about gathering real world data to further autonomous driving.

Google has their small fleet collecting data; Tesla has tons of vehicles now collecting data; Uber has the potential to start collecting data. Apple has no ability to collect real world data... Until now?? Smart move Apple

radicsge 2 days ago 2 replies      
I highly doubt this money is for real investment; most probably it's just to please the government, and this $1B will eventually be distributed among citizens without jobs. Also, the big cities have become super crowded as they are extremely urbanized. On some days traffic jams start at 3, and between 5 and 7 everything is just stuck. It might also be a way for the government to slightly reduce these issues.
molmalo 2 days ago 1 reply      
While I do believe that they are investing in Didi with the dual intent of gaining access to its massive riding data and positioning themselves as providers for future autonomous vehicles, I also believe (and I could be very wrong) that they may be funneling money through China for their US-based facilities.

I mean, could this be related to Faraday Future? The mysterious "US-based, Chinese-backed company" that plans to invest $1B in California, "focused on the development of intelligent electric vehicles and mobility solutions" [1], which many suspect is a front for Apple's car.

[1] https://en.wikipedia.org/wiki/Faraday_Future

desireco42 2 days ago 1 reply      
And that is it. Owning a piece of such a large Chinese company, Apple also gets a foothold in decision making in China and gets more say. Very smart.

Jobs was a different person; Tim Cook leads the company differently. I really like how he plays this.

kirykl 2 days ago 0 replies      
With the scale of Chinese companies is $1b a large investment? The article says Didi has already raised several billion
iamgopal 2 days ago 4 replies      
How does economics work? $1B for an app? Innovative small-scale businesses are struggling to raise a million, while blatant copycats raise billions? Capitalism works correctly, though: what they have earned has to be given back in order to earn more.
eddieplan9 2 days ago 0 replies      
From the article:

> "(The deal reflects) our continued confidence in the long term in China's economy," Cook said.

If this is not kowtowing, I don't know what is. This is very disappointing coming from a company like Apple.

jackieluo 2 days ago 0 replies      
I haven't been this surprised by anything Apple's done in years. If moves like this one keep happening, the next few years are going to be pretty exciting to watch.
swyman 2 days ago 2 replies      
Probably dumb hypothetical: What happens if Apple comes up with an excuse to remove Uber from the (Chinese) App Store and corresponding iPhones?
davidiach 2 days ago 1 reply      
"The company said it completes more than 11 million rides a day, with more than 87 percent of the market for private car-hailing in China."

If they make $1.00 for each ride, that's already more than $4 billion in revenue per year.

free2rhyme214 2 days ago 0 replies      
I was just starting to doubt Apple, especially with their lackluster effort in artificial reality, and now I'm doing an about face.

This is an incredibly smart move and a great long term play which will bode well for Project Titan.

chj 2 days ago 0 replies      
One thing is for sure: Uber will have a hard time in China.
dingo_bat 2 days ago 0 replies      
Is there any reason Didi Chuxing is valued higher than Uber? Uber is the innovator and is present all around the world. Why is it still valued lower than Didi?
qaq 2 days ago 0 replies      
Apple to follow in Yahoo's footsteps: skip ahead 10 years and this might be the most valuable part of Apple :)
sangd 2 days ago 0 replies      
It looks like Apple wants to get the Chinese superstar startup(s) to depend more on its giant pile of overseas cash, thus protecting its brand and maintaining its stronghold in this market, which is influenced very much by the state. Look at their stock today: Google is surpassing them, and Apple has no future other than protecting its iPhone, iPad, and Macs, which are pretty much saturated.
JayeshSidhwani 2 days ago 0 replies      
Could this also be because Apple would want to test their experiments in self-driving cars?
Teaching C regehr.org
364 points by mpweiher  5 days ago   152 comments top 21
robertelder 5 days ago 11 replies      
In my quest to learn C very well over the past few years, I've come to the conclusion that C is best understood if you think about it in terms of the way that an assembly language programmer would think about doing things. An example of this would be if you consider how switch statements work in C. Switch statements in C don't really compare to switch statements that you find in other languages (eg. https://en.wikipedia.org/wiki/Duff%27s_device).

The issue that many students face in learning low level C, is that they don't learn assembly language programming first anymore, and they come from higher level languages and move down. Instead of visualizing a Von Neumann machine, they know only of syntax, and for them the problem of programming comes down to finding the right magic piece of code online to copy and paste. The idea of stack frames, heaps, registers, pointers are completely foreign to them, even though they are fundamentally simple concepts.

dbcurtis 5 days ago 2 replies      
Let me add the perspective of an old dinosaur that learned to program before C had been invented.

C maps nearly 1:1 onto simple processor and memory models, and most importantly, gets out of your way and lets you get on with solving your system programming problems. Before C, just about any meaningful system programming task required a dive into assembly language. In that context, C was a huge win. It is also what makes C the language of choice for embedded development today.

Of course, system programming problems are not the bread-and-butter of most developers today -- and a good thing, too. We can now build on top of solid systems and concentrate on delivering value to the customer at much higher levels of abstraction: the levels of abstraction that are meaningful to customers.

I dearly love Python because it allows me to work at levels of abstraction that are meaningful to the user's problem. I dearly love C when I want to wiggle a pin on an ARM Cortex-M3.

In my mind, CS education should start by teaching problem decomposition and performance analysis using a language like Python that provides high levels of abstraction and automated memory management. Then, just like assembly language was a required CS core course back in my day, students today should spend a semester implementing and measuring the performance of some of the data structures that they have been getting "for free" so that they understand computing at a fundamental level. Some will go on to be systems programmers, and will spend more time at the C level. Some won't ever look at C again, and that is OK.

In the end, CS education is about how to solve problems through the application of mechanical computation. The languages will evolve as our understanding of the problems evolves and our ability to create computing infrastructure evolves. CS education should be about creating people who can contribute to (and keep up with) that evolution.

parr0t 5 days ago 0 replies      
I'm currently at uni studying CS and recently finished my 'Programming in C' unit. The teacher said from the get-go that it would be challenging compared to other languages we had used to date (mainly Java) and that quite a few students struggle with it. Once I got my head around pointers and debugging through GDB/Valgrind, the unit became immensely enjoyable and rewarding.

We didn't use any fancy IDEs and were told to stick to Vim. We also had to compile with the flags -ansi -Wall -pedantic, which alerted you not only to errors but to warnings if your code didn't meet the C90 (I think) standard. It was a lot of work crammed into 13 weeks, but it had one assignment which I thoroughly enjoyed.

Tic Tac Toe (Ramming home using pointers, 2D arrays, bubble sort for the Scoreboard).

Debugging a bug-riddled program (My favourite).

Word Sorter (Using dynamic memory structures, memory management by having no leaks, etc).

The debugging one was very different from most other assignments I had done at uni to date, and the teacher said he recently introduced it because the university had received feedback that students' debugging skills weren't the greatest. They could write what they were asked to just fine, but when it came to debugging preexisting issues quite a few struggled. We got given a program with around 15 bugs, and you got marks for identifying what was causing each bug and a valid solution to fix it. This forced us to use tools such as GDB and Valgrind to step through the program and see where the issue was, and to be much more methodical.

I really enjoyed C and when I find a bit of time outside of work and study I'd like to explore it more.

fisherjeff 5 days ago 1 reply      
A long-standing gripe of mine: When I clicked through to his example "cute little function" in Musl, I found myself mentally adding comments to work through all the "cuteness". If that's the case, IMO, it's either too cute or needs more comments - not sure how much time I've spent picking apart kernel code just to figure out what the hell some of it does, but it's definitely not time well spent.

EDIT: Meant to add: fantastic article, wish my Intro to C instructor had read it...

fdej 5 days ago 5 replies      
The core of what makes C elegant is that basically everything that looks atomic is atomic, in the sense of taking O(1) time and space (at least until C99 introduced its abominable variable-length arrays, and perhaps some other features I'm forgetting about).

Absent macro obfuscation, it is easy to reason about what a snippet of C does and how it translates down to machine code, even taken out of context. In C++, something as innocent as "i++;" could allocate heap memory and do file I/O.

The downside is that C code can become quite verbose, and to do anything useful, it takes a lot of ground work to basically set up your own DSL of utility functions and data structures. For certain applications, this is an acceptable tradeoff and gives a great deal of flexibility. I think teaching this bottom-up approach to programming can be quite useful - in a way, it mirrors the SICP approach, albeit from a rather different angle.

The question is, why are there not more languages that have the same paradigm, but also add basic memory safety, avoid spurious undefined behavior, provide namespaces, with a non-stupid standard library, etc.?

latenightcoding 5 days ago 5 replies      
Great post! Universities tend to teach a very small subset of C: just enough to make a tic-tac-toe application or something silly.

I learned C by myself many years ago, but it's only recently that I have been using it for big projects.

Reading Redis' source code was a great aid; xv6 is also amazing for learning systems programming.

Learn C The Hard Way is also a good read, but not as your main book, since it goes too fast. Other invaluable resources are Beej's Guide to Network Programming and Beej's Guide to Unix Interprocess Communication.

A good advanced book is Advanced Programming in the Unix Environment

nickpsecurity 5 days ago 0 replies      
Although a C opponent, I find this to be a good writeup. I hope more C students see it. In particular, the author focuses on introducing students to exemplar code, libraries with pieces they can study in isolation, and making a habit of using checkers that knock out common problems. This kind of approach could produce a better baseline of C coders in proprietary or FOSS apps.

The only thing I didn't like was the goto-chain part. I looked at both examples thinking one could just use function calls and conditionals without nesting. My memory of C's semantics is hazy, so I couldn't be sure. Yet, sure enough, I read the comments on that article to find "Nate" illustrating a third approach without goto or extreme nesting. Anyone about to implement a goto chain should look at his examples. Any C coders wanting to chime in on that, or on alternatives they think are better (which also avoid goto), feel free. Also, Joshua Cranmer has a list there of areas he thought justified a goto. A list of good alternatives to goto for each might be warranted, if such alternatives exist.

The only improvement I could think of right off the bat would be including lightweight formal methods for C, or things like the Ivory language, which is immune to many C problems by design and extracts to C. I'm not saying they're a substitute for learning proper C so much as useful tools for practitioners that are often left out. The Astrée analyzer and safe subsets of C probably deserve mention, too, given the defect reduction they're achieving in the safety-critical embedded sector.

acbart 5 days ago 0 replies      
Although this is an interesting post, I'm disappointed from a pedagogical point of view. The article covers these topics, in this order:

1. What book do we assign?

2. What should we lecture?

3. What sort of code review work should we have students do?

4. What kind of assignments should we use? But only to say that he won't cover it in the article!

This is almost the exact opposite order of what is most useful in terms of learning. Yes, some people (especially autodidactic and well-focused students) are able to learn tremendous amounts on their own through books. But books are a relatively poor tool for teaching compared to active learning methods. Lecture can be great, but is usually passive and worse than useless.

I want to acknowledge the importance of defining what you will teach and what successful (end-of-course) students look like and how to assess them. After you've decided that, it is proper to devise assignments and assessments, and then to decide on lectures and supplemental materials that support students in completing the assignments and assessments successfully. The time students spend should be active and practical - not that readings can't be provided, but they should be on-point and meaningful. Proper application of Instructional Design principles and theories of learning can make a world of difference for students.

But kudos for thinking about it, and kudos for thinking about feedback mechanisms.

PS: Obviously, I believe C has a great place in the curriculum - shouldn't leave undergrad without it!

s_m_t 4 days ago 0 replies      
I love K&R, 21st Century C, Understanding C Pointers, and Deep C Secrets, but I think they are a little complicated for beginners. I wouldn't bother opening them until you have written a couple of small programs in C or have good experience in other languages.

When I first learned C in high school, I got a few books on C that all seemed to have the word "Beginner" in the name. "Absolute Beginner's Guide to C" is one I remember in particular. I think having multiple books is pivotal, because as a beginner, if you encounter an explanation that doesn't make sense to you, it is very hard to reason around it. You probably have very little prior knowledge; almost everything you know and learn up to the point where you get stuck will be contained in that single book, and if you don't know any other languages, you can't make any connections to help yourself out. The reason the second, third, or fourth book is so important is that it will have a slightly different explanation that might make something click in your brain.

haberman 5 days ago 0 replies      
I thought this was an excellent post. C has changed in lots of important ways in the last 10-20 years. The changes are both convenient (far better tooling) and inconvenient (much less forgiving of undefined behavior). Those of us who use C professionally have had to pick up most of these changes by osmosis. This was a really great run-down on how you'd bring a newbie up to speed with the state of the field.
kbenson 5 days ago 0 replies      
'"even what seems like plain stupidity often stems from engineering trade-offs"'

This has truly become something I try to keep in mind, considering that a) sometimes, long after starting on someone else's code base, I've learned a useful rationale for some of the previously more inscrutable things in their code, and b) I've ended up writing a few things like that myself.

Documentation is key to understanding these systems, but it isn't sufficient. Often you are presented with a nicely documented mega-function which, while anyone can read through it, is very hard to reuse a portion of when needed. In breaking it apart into smaller chunks, you necessarily scatter some of the reasoning about why a particular approach was taken away from where it was originally used, or at least from where the weird behavior is required. You can either reproduce large chunks of the documentation at many different points in the code base, and hope it doesn't get out of date as the systems it describes in other files are slowly changed, or keep the documentation strictly pertaining to the code immediately around it, in which case the knowledge of how the systems interact can get lost.

Whenever you encounter code that seems to make no sense, it's better to assume there's some interesting invisible state that you need to grok, than that the programmer was an imbecile or amateur. The latter may be true, but assuming that from the beginning rarely leads to a better outcome.


I'll share my favorite example of this. At a prior job, we had a heavily used internal webapp written in Perl circa 1996. It was heavily modified over the years by multiple people, but by the time I was looking in on it in 2012, it was a horror story we used to scare new devs. The main WTF was that it was implemented as one large CGI which eschewed all use of subroutines for labels and goto statements, of which there were copious amounts. The really confusing part was that they were used exactly as you would expect a sub to be used, just with setting a few variables and a jump instead, so we always scratched our heads as to the reasoning for this. There was even a comment along the lines of "I hate to use goto statements, but I don't know a better way to do this, so we're stuck with this."

Fast forward a couple years, and I'm migrating the webapp to a newer system and Perl, and I discover the reason for this. At some point it was converted to be a mod_perl application, and the way mod_perl for Apache works is to take your entire CGI and wrap it in a subroutine, persist the Perl instance, and call the subroutine each request. The common problem with this is that because of this any subroutines within your CGI can easily create closures if they use global variables. The goto statements really were intended to be used just like subroutines, because they were likely switched to in an attempt to easily circumvent this problem. Now, there are better methods to combat this, such as sticking your subroutines in a module, and having your CGI (and then mod_perl) just call that module, which is what I ended up converting the code to do, but the real take-away is that the original decision, as impossible to defend as it seemed, was actually based in a real-world trade-off, and at the time it was done may have actually been the correct call.

feklar 4 days ago 0 replies      
The Harvard CS50 course on edX does a pretty good job of teaching C, especially if you do the recommended reading/psets at the "hacker level", which are from the book Hacker's Delight 2.

There is some initial magic, where they have you #include cs50.h, which is full of black-box functions, in the beginning; but other than that, it's a good example of teaching beginner C.

satysin 5 days ago 2 replies      
If part of a CS course I think C is an excellent first language. Perhaps not for someone wanting to learn about software development on their own though.

It seems to me that while we know how to teach C properly today, not many places actually do.

lunchTime42 5 days ago 0 replies      
There is not one C. There are multitudes of C. C is a recombination of the programming language with the compiler, the platform, the code conventions of choice, the libraries chosen, and the operating system (if there is one).

And C needs knowledge of all those fields combined to be really used freely. Miss even one of them and you will be like a wanderer on a frozen lake, doomed to trust those who know to guide you by ramming in posts of no return where the ice gets thin.

It's also about taking a sledgehammer to all those certainties people have about computers from marketing and from personal experience as consumers.

orionblastar 5 days ago 0 replies      
I find that many books that teach C either assume the reader knows how to program, or are too complex for them to understand.

I was going to write a Kindle book, a beginner's guide to C using the Code::Blocks IDE, because it is FOSS and cross-platform. I found out it is a lot harder than I thought it would be.

I learned C in 1987 at a community college and still have the book, which was written for Microsoft C; we used Turbo C and Quick C for some of the assignments. Most of the programs I wrote can still compile, and those that get errors or side effects can be debugged easily.

foyk 4 days ago 1 reply      
"This claim that positive signed overflow wraps around is neither correct by the C standard nor consistent with the observed behavior of either GCC or LLVM. This isn't an acceptable claim to make in a popular C-based textbook published in 2015."

Perhaps someone could explain what I'm missing. It's exactly the behavior that I see using gcc-4.8 and Apple llvm-7.3.

andrewfromx 5 days ago 2 replies      
CS degree from pitt.edu, 1996, and C was not required. But a friend and I took it as an elective. We did not want to get out of school with a CS degree and no C.
awinter-py 5 days ago 0 replies      
first piece in a while to make me optimistic about college curriculum priorities.

Not sure it's possible to teach green frosh 'why does industry use an old language' and the static analysis ecosystem (easier to teach skills than wisdom). But I applaud these people for trying. This feels like real programming.

Paul_S 5 days ago 0 replies      
Knowing assembly is a good first step and a prerequisite to be useful in an embedded project.
lil1729 4 days ago 0 replies      
All good points. But teaching all of this in one semester? Poor students...
ape4 5 days ago 0 replies      
For all assignments, tell the students to use the most appropriate language. Plot twist: all assignments are for high level applications and C isn't the most appropriate.
Amazon Reviews: How We Spot the Fakes thewirecutter.com
329 points by Osiris30  2 days ago   227 comments top 47
jonstokes 2 days ago 6 replies      
I actually know people whose job it is to write these things, and to create and maintain sock puppet personas on forums for the purpose of subtly promoting clients' products and disparaging those of competitors.

Then there are the professional reviews, which are all conflicted because most reviewers either get to keep the gear or the gear maker is a sponsor of the site/publication and the reviewer knows it. I say this as a person who's still involved in publishing reviews (of outdoor gear, mostly), and who most of the time gets to keep whatever some company sends me.

However rotten you think the adtech industry is, the state of product reviews online -- user generated, casual forum reviews, professional reviews, etc. -- is worse.

I have a standard rant about the current state of the web that I give all my family and friends, and it always ends with: "if you didn't pay to read it, you probably read an advertisement."

xiaoma 2 days ago 9 replies      
One thing I've found useful is following specific reviewers. For example, Peter Norvig reviews many, many programming and CS texts.

He has both glowing reviews such as this one: http://www.amazon.com/gp/review/R30AQNQK2I1O7P?ref_=glimp_1r...

And scathing ones such as this one: http://www.amazon.com/gp/review/R2C7L5KHUVHOR2?ref_=glimp_1r...

By following several reviewers I respect in each domain I care about, it's not hard to find quality texts (or products) that I'd never have found by typing words into Amazon's search box and looking at average review scores.

anexprogrammer 2 days ago 4 replies      
Half the time the fakes are so ludicrously easy to spot that I really wonder why they bother. Fifty glowing reviews of a book within a day of publication all raising the same 4 esoteric points in the review. No doubt 4 points suggested by the author/publisher when asking for the astroturfing.

Completely disregard all Amazon Vine reviews.

Now I weight heavily in favour of reviews that have had update edits - "it failed after 2 months. The replacement failed 3 months after that" (common on all makes of $200 LED gaming keyboards) or "after a year it's still in perfect condition". Photos on the reviews are often a good sign too, but only when there are just a few - photos on reviews are rare, so lots with similar framing would be an instant strike.

Try finding a CAT6a cable on Amazon - seems they're all 4.5 - 5* wonderful, yet dig into the bad reviews a bit and most of them don't even meet wire or shield spec. Even those coming from Amazon themselves.

Aside from reviews from specific, trustworthy people, reading the mid-tier and bad reviews gives a much better flavour of what a product is about, and the odd review tearing the product to pieces can be worth its weight in gold.

If it's a marketplace seller my starting point is to mistrust all reviews and let them earn trust really slowly.

I sure won't be going near Fakespot, especially when they're so intentionally vague in describing their own product - it'll be a score to game like your Klout score and all the other meaningless metrics. The astroturfers will adjust what's needed in the fakes to pass.

kilroy123 2 days ago 3 replies      
I no longer trust Amazon's reviews. They're so obviously loaded with fake reviews. I try to be smart about it and sort by most recent. I've been burned on a few products recently as well.

I met a shady guy recently who has personally paid for tons of fake reviews on Kindle books. After hearing from him how easy it is, I really lost trust.

fpgaminer 2 days ago 2 replies      
One potential solution is for Amazon to do a better job of encouraging users to rate every product they buy. That would quickly drown out the fake reviews and make such tactics uneconomical.

Off the top of my head I would suggest that on the post-checkout screen they should list the customer's most recent purchases asking "Please rate your current satisfaction with your recent purchases". Just a quick 5-star rating, a small button to write a textual review if the user wants, and a note clearly stating that their review can be changed later if their opinion changes.

I suggest putting that at post-checkout because it's a point where Amazon still has the user engaged, so asking is not annoying (like their emails currently are), but the user is done shopping so they aren't preoccupied with something else. In other words, it is the least annoying time to ask. They should, of course, also have a rating box on the home page (again, a list of recent purchases, possibly muxing in older purchases at random).

And then weighting reviews is super important. Reviews given by users later in their ownership of the product should be weighted higher than reviews given just after receiving the product. Changed reviews should be weighted higher, as well as reviews on returns.

leejoramo 2 days ago 1 reply      
There are other related problems with Amazon's product pages:

Combining different items into one product page and review. For example take a look at the review for the "Dell Ultrasharp U2415 24-Inch" display. There are 6 different sized monitors on that page. Some have very different technical specs. Combining products makes sense when the difference is color preference or the size of a shirt. But the 24-inch U2415 is a very different display from the 34-Inch curved screen U3415W

Question and Answer section just before the actual reviews. I often see nonsense such as a question about a computer cable:

Q: Can I eat it?

A: I eat mine with ketchup.

ams6110 2 days ago 9 replies      
If http://fakespot.com/ can spot fake reviews, Amazon themselves should be able to do it. It's in their long term interest that product reviews are genuine so that buyers have confidence. If I can't trust Amazon reviews, I'll just shop at Target where at least I can handle the physical product before I buy it.
mmanfrin 2 days ago 0 replies      
The other major issue is in bait-and-switching products. A company can put out a great product that generates true 5-star ratings, and then swap out their product with a cheaper version but continue selling it at the 5-star-product-price.
stagger87 2 days ago 2 replies      
I love that Amazon tells me when a reviewer is part of the Vine program, so I can immediately ignore it. It's funny when a niche product has overwhelming negative reviews and a string of 5 star vine reviews.
ldpg 2 days ago 0 replies      

Most top listings are saturated with fake reviews. If you want to understand the depth of the fake listing problem, look at the thousands of people trying to create new listings in high-competition markets. Search Amazon for:

 "silicone oven mitts" "bamboo cutting board" "meal prep container"
Click any result, then scroll down to the "Sponsored Products" section for that item. Repeat this, and really dig down into the "sponsored" lists. Notice the number of copycats.

Now look at the reviews. For new sponsored products in these categories virtually all of the entries are from review-mill programs. There can be hundreds of fake reviews on a single listing. Many of the negative reviews you may see on these are fake as well, from competitors!

Review-mill programs work by having the reviewers buy the product through Amazon (they are reimbursed or given a discount). Then they write it up. Many write-ups are short and organic-looking; the more easily spotted ones contain lots of text and multiple photos of the product.

bitL 2 days ago 4 replies      
Overall there seems to be an epidemic of "fakeness" in Internet ratings - even looking at IMDb movie ratings, I see, e.g., a movie at 7.7/10 where the first few pages of reviews are unanimously 1-4/10, from what seem like genuine reviewers complaining about being deceived by IMDb into watching the movie...
biot 2 days ago 1 reply      
I was looking at bathroom scales recently and noticed this myself. Lots of five-star reviews, many from people who received the product free in exchange for a review. If you look deeper, most of these reviews are superficial. Adjust the review filter a bit and a clearer picture emerges, with one- and two-star reviews from people who use the scale daily and have noticed a ton of flaws.

The product in question: https://www.amazon.com/gp/aw/reviews/B0113YL5V4/?sortBy=rece...

If you search for glass bathroom scales on Alibaba, you can find this exact model available at a price of $4/unit in volume so it's not surprising if it ends up being not the greatest quality.

thruflo22 2 days ago 2 replies      
My thought on this was to develop a generic review system (i.e., one that would work for anything with a URL) that shows you subjective results based on your social graph / trust map and weeds out bad actors by crunching the Advogato algorithm.

The challenge is liquidity, i.e., how you seed the network and content, but if you can solve that, you can operate a review service for a fraction of the resources currently being wasted on these Red Queen races.

ck2 2 days ago 0 replies      
Another problem on amazon is allowing 3rd parties to "piggy-back" on original listings that aren't available directly from Amazon itself, and then the 3rd party will either substitute or not actually have the original item at all and ship whatever.

So the original item gets legit great reviews. But the reviewers never mention which seller they bought from. 3rd parties glom onto the high ratings, and a lot of people end up disappointed, either with fulfillment or with the product itself.

This is why I no longer buy unless it is amazon or amazon fulfilled (so I can at least return it). But you pay a premium for that.

Amazon could solve this by just noting which vendor the reviewer bought from on the subject line.

xt00 2 days ago 1 reply      
I know people who sell stuff on Amazon, and they don't use fake reviews; but if a competitor does, it makes it hard not to want to use fake reviews also. It's a vicious cycle. Plus they get negative reviews out of nowhere - very likely the same competitor is paying to put negative reviews on your page. So it's a double whammy: add a bunch of 1-stars to a competitor's page and a bunch of 5-stars to your own... Basically, the whole system needs cleaning up somehow.
toephu2 2 days ago 0 replies      
Amazon needs quality control. They sell more and more crap, and you have to spend so much time perusing the reviews to find out whether a product is actually good. Sometimes I prefer to go to Costco where, yes, I deal with long lines, but at least I know what I am buying is of decent quality, the prices are low, and they have a very generous return policy (most electronics can be returned within 90 days, and anything else you can return whenever you want, with or without the receipt).
okket 2 days ago 3 replies      
The problem is: Why should I write a review? I have nothing to gain.

Maybe I have had a bad experience and want to warn others: that would explain the mostly bad reviews for almost every product. (Given enough users, some will have bad experiences, if only from misusing the product.)

I can understand that mostly bad reviews creates an incentive to counter these with good reviews, but from the wrong side (seller, not consumer). This would explain the many fakes.

I guess reasonably trustworthy 3rd party sites like Consumer Reports are still necessary.

p4wnc6 2 days ago 0 replies      
This reminds me of a recent HN thread about Amazon Echo. I brought up my skepticism that the undisclosed but touted Amazon sales of Echo were really a signal that the device is being adopted and well-liked by the consumer market that Amazon sought to target.

And, for saying anything even slightly negative about a big tech company, I of course got downvoted to oblivion by everyone adding their own anecdata.

Most loudly, at least one commenter linked to the reviews on Amazon itself, some tens of thousands with a 4-ish star average, as if this were definitive proof of something. Even just casually looking through the reviews, it was easy to see that almost all of the glowing, positive reviews were ghostwritten, or came from "Top X reviewers" who clearly have a selection bias toward offering positive reviews to reinforce their status as highly ranked Amazon reviewers, or from people who got advance copies of the device (again, a selection bias if they are the sort of people who actively seek out new tech to review), and on and on.

What an echo chamber this place has become (no pun intended).

magoon 2 days ago 3 replies      
Sellers also game the system by offering a refund, without requiring a return, in exchange for reverting a bad review. I've reported this to Amazon but they weren't receptive.
maaaats 2 days ago 0 replies      
I liked Amazon better before it became what eBay became. It used to be a warehouse; now it's thousands of merchants competing for exposure, thus prompting this.

When I search for a product, I now often get tens of matches, all almost identical, all with different prices and descriptions and equally bad photographs.

PaulHoule 2 days ago 0 replies      
I am more afraid of fake bad reviews.

In the case of electronics, photography gear, and similar, you find people get more consistent results with some products. For instance, charging cables: hypothetically you could buy a $5 third-party cable vs. a $20 OEM part, but any thought that you might have a fire leans you toward the $20 cable.

Nagyman 2 days ago 0 replies      
It certainly makes it more difficult to compare products. These third-party sellers aren't doing themselves any favours. I recently wanted to purchase a simple phone case (Samsung S7 Edge, so relatively new). ALL of the cases had 4-5 stars, which makes reviews useless during research mode. It was pretty easy to spot patterns in reviews that were essentially paid-for but very annoying.

In the end, I ordered a case (after checking YouTube video reviews) and was contacted by the seller to review their product on Amazon in exchange for 10% off my next purchase with the same seller. BUT not if I was going to give them less than 5 stars; in _that_ case they wanted me to contact them directly.

/end anecdote

googletazer 2 days ago 0 replies      
There are portals that connect reviewer shills with sellers who want to boost their product ranking. Google "buy amazon reviews" and you'll see it. Some do it individually, on fiverr for example.

The portals themselves are a very lucrative business, and are not going away anytime soon. There is an arms race between sellers to get that sweet sweet real estate (get their products on page 1 or page 2), and business can be a cutthroat matter. Of course, the system will eventually come to a steady state when too many buyers get burned buying crap. Review and feedback systems is a topic I'm very interested in.

tmaly 2 days ago 1 reply      
I am glad they added verified reviews for people who actually bought the product.

I wish they would take it one step further and add an option to show only verified reviews and the associated stats for those verified reviews.

exratione 2 days ago 1 reply      
Clustered timing of reviews is unfortunately not a great metric for determining whether or not they are legitimate. Many - probably most - companies send out emails to buyers asking them to review, no compensation provided. This is pretty effective, and a big driver of legitimate reviews. The timing of those emails is determined by response rates, which tend to be better on specific days - hence the mails are batched. So for some large entities, you'll see reviews clustering on Thursday to Saturday for example.
hippich 1 day ago 0 replies      
Yes - there are fake reviews, but there is also a bunch of discount-for-review activity. Most of these reviewers leave a disclaimer, and in my experience often leave 1-3 star reviews.

Now the reason for that is that it's super hard to get going with your product without sending discounted samples to reviewers. Even if you spend tons of money on ads and somehow convince a customer to buy your product without reviews, that customer will, 99.9% of the time, never leave a review. And if you dare to email that customer to ask for feedback, they will curse you, complain, and report you to Amazon.

That's where things currently are - it is a two-way street. Without any traffic on it, you will end up with a few shitty products from big companies who have no reason to improve quality or innovate.

Just 2c from the other side of the fence.

Godel_unicode 2 days ago 0 replies      
And now we get to have a recursive "but what's _your_ bias" discussion. For instance, would a writer for a site that posts professional reviews have a bias (conscious or otherwise) against reviews written by amateurs? Add in the "if that person was good, I'd be out of work" bias, and here we are.

Note that I'm talking specifically about the author's critique of the Vine program; obviously the review spam is a real problem.

That aside, I do find The Wirecutter to be a good starting place.

kin 2 days ago 1 reply      
Yeah after being tricked by a product or two I've now developed a keen eye for products with fake reviews.

Funny story: a co-worker of mine fell for the fake-review trick and bought a product, talking about its rave reviews. I knew from looking at the reviews that they were fake and he was going to be disappointed. He said "No, look, it has great reviews". I said okay, you're going to be disappointed. And he was disappointed. It was the "Amazon featured" lock laces.

CM30 2 days ago 0 replies      
When it comes to Amazon reviews, let's not forget some of the older examples of them being gamed. Before the internet 'services' and spammers offering to make fake reviews, the same thing was being done by notorious dodgy authors and creators on a more manual level. Like Robert Stanek, a notorious self published fantasy author who posted thousands of fake reviews of his books (and probably still does):


For example, his books sold maybe ten copies a week, yet somehow got hundreds of glowing and suspiciously worded reviews on Amazon. It got to the point where other sites (and Wikipedia) basically put up warning signs.

So yeah, even before the current crop of reviewer sellers started popping up, this system was being gamed by the authors and creators themselves using sockpuppets to flood it with fake reviews for their work.

jldugger 1 day ago 1 reply      
Why is the wirecutter even reading Amazon reviews? Isn't their job to give independent analysis?
guelo 2 days ago 1 reply      
If FakeSpot can detect fakes Amazon should be able to do it even better. The fact that FakeSpot even exists shows that Amazon has decided that they've won the online retail competition and there's no need to spend any more resources on improving it.
uptown 2 days ago 1 reply      
Try to buy an in-ear or forehead thermometer on Amazon based on reviews? I've found it impossible - because every one they sell seems to have its reviews tainted by people who received free merchandise in exchange for their "honest review".
VeejayRampay 2 days ago 0 replies      
The reviewing system on Amazon gives WAY too much space to the hordes of morons who give products one star because the package was damaged on arrival. As long as that's the case, the average rating won't mean anything.
Magnets 2 days ago 0 replies      
The other problem is that companies provide free or heavily discounted products in exchange for a review.

So they are using real users who then with a wink and a nudge provide 4 & 5 star reviews to keep the free products coming.

sixdimensional 2 days ago 0 replies      
Maybe if Amazon simply posted a score/rating of how many people bought vs. returned the particular product and frequency over time, that information would alone be helpful. However, not sure they would ever share such competitively valuable information directly. They do sort of share it indirectly through the best sellers lists on the Amazon site, however, they do not share details of "how" they arrive at what "best-selling" means.
chaostheory 2 days ago 1 reply      
The only thing the author doesn't seem very familiar with is Amazon's Vine program, which isn't surprising since there aren't a lot of members.

> For example, returns and long-term use aren't part of the evaluation. When you get something for free, you're less likely to follow up on breakage concerns or customer service issues. Additionally, if the reviewer didn't actually buy the product, that person doesn't take the purchase and shipping processes into consideration.

I don't feel that this is true. People who are selected for Vine aren't random Amazon customers; they are Amazon's top-ranked reviewers. A lot of these reviewers had reviewed hundreds of items before Vine or any paid reviewing program even came into existence. i.e., they do it for the 'love', so they will discuss price and shipping considerations.

Yes, given Vine's 30-day review deadline, you're not going to see anything initially about long-term use; but a lot of these reviews do get updated if anything about the product changes from the original review.

> You might notice how few of the reviews through Vine and similar programs are negative or even critical. This isn't a case of reviewers intentionally being dishonest, but rather the result of unconscious positive bias.

I don't disagree here.

> Not paying for an item can make difficulties with that item seem less irritating.

Vine items are now treated as taxable income starting this year, so Vine reviewers are now paying for the items that they select.

> Additionally, reviewers may give their opinions on items for which they have no expertise or real experience and therefore have no frame of reference about how well something works by comparison.

I could be wrong but I feel that available Vine items are based on the customer's past purchase and review history. There's no guarantee that the customer is an expert, but there's some proof that the reviewer is at least familiar with the product category.

I'm sure that there are some bad Vine reviews and bad Vine reviewers. However discrediting the entire program without any data just doesn't seem fair.

(In case it wasn't already obvious, I'm part of the Vine program)

perseusprime11 2 days ago 0 replies      
This is a widespread problem across the board, not just on the web. Look at some of the reviews of the apps on iOS: very detailed, making a pitch to future users to buy the app. I almost never leave a review unless it's a great app or I got really mad at it. I've never felt the need to leave a review if the app did the job it's supposed to do.
mrfusion 2 days ago 0 replies      
My idea: instead of having reviews, why not show the percentage of orders that were returned? That would give you a sense of the quality.

Another idea is to randomly select customers who actually ordered the product to leave a review. Since it's random you wouldn't have the self selection problem of people getting paid to leave reviews.
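A minimal sketch of how such a return-rate signal could be made robust for low-volume products (the function name and the Wilson-lower-bound smoothing are my own illustrative choices, not anything Amazon actually does):

```python
import math

def keep_rate_score(num_orders: int, num_returns: int, z: float = 1.96) -> float:
    """Wilson lower bound on the fraction of orders that were NOT returned.

    A raw (orders - returns) / orders ratio is noisy for products with
    few orders; the Wilson lower bound stays pessimistic for small
    samples and converges to the true keep rate as order volume grows.
    """
    if num_orders == 0:
        return 0.0
    p = (num_orders - num_returns) / num_orders  # observed keep rate
    denom = 1 + z * z / num_orders
    centre = p + z * z / (2 * num_orders)
    margin = z * math.sqrt((p * (1 - p) + z * z / (4 * num_orders)) / num_orders)
    return max(0.0, (centre - margin) / denom)

# A product with 1000 orders and 10 returns outranks one with 10 orders
# and zero returns, because the larger sample carries more evidence.
```

Ranking by a score like this rewards items whose low return rate is backed by real sales volume, which would also make the signal harder to game with a handful of purchased reviews.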

B1FF_PSUVM 2 days ago 1 reply      
The sad part about all this is that the "social networks" will drive their bus through this hole.

It will be "Get recommendations from people you trust!!! (sortof)".

And, in short order, we'll be back to the same situation, only more tied-up, gagged and corrupted. ("Hey, pssst, make money fooling your friends!")

codecamper 2 days ago 0 replies      
Oh pleeeeeeze bring Fakespot to App store reviews! I'm not going to whine any more... but pleeeeeeeeze!

I've got a competitor that has made millions making fake reviews to boost up his lousy little app. He's been at it for about 5 years!

stcredzero 2 days ago 1 reply      
As always, it comes down to economics. So long as it's economical for people to run fake-review companies online and for companies to buy sets of fake reviews, it is going to happen. So how do we change those economics?
dredmorbius 2 days ago 0 replies      
TL;DR: Recommendations systems cannot be indifferent to truth.

This article highlights pretty much precisely the same problem that the "social media promote (bogus) conspiracy theories" item yesterday did. Slightly recycled comment.

Some related concepts:

1. Donella Meadows, in Thinking in Systems, notes that an absolute requirement of an effective and healthy system is accurate feedback and information. Media which are indifferent to truth value, or which actively promote distortion (see Robert Proctor's term, agnotology), will actively harm the system.

2. Celine's 2nd law, and its inversion. In Robert Anton Wilson's Illuminatus! trilogy, a character notes that "accurate information is possible only in a non-punishing situation". Its inverse is also true: accurate information is possible only when it is accuracy itself, and ONLY accuracy, which is rewarded. Academic publishing, in which paper output and journal selection is a gateway determinant of professional careers, would be an instance of this. Or the long skew of The Learning Channel from NASA-PBS educational co-production to Honey Boo-Boo broadcaster.

3. Paperclip Maximizer. "Don't be evil" isn't good enough. You've got to actively seek out good. Even a benign maximisation goal will, if not tempered by requirements to provide net benefit, lead to catastrophic results.

4. Mancur Olson's "The Logic of Collective Action" explains how and why small (but motivated) fringe groups can achieve goals directly opposed to the interests of far larger groups. This explains a great deal of market and political dysfunction.

5. A generalisation of Gresham's Law leads to the realisation that understanding of complex truths is itself expensive. It's also (Dunning-Kruger) beyond the capability of much of the population. This also has some rather dismal implications, though as William Ophuls notes, political theory based on the assumption that all the children can be above average ("the Lake Wobegon effect") is doomed. You dance with the dunce what brung ya.

Amazon here will probably have to create and fund their own review staff, if only to vet and validate user-supplied reviews. As one HN poster here notes, "what is my incentive to provide reviews?" A good review takes time and experience, is likely to draw flak, and, frankly, often comes from someone with better things to do with their time.

Crowdsourcing is useful where data are thin and the crowd is likely to be unbiased. Where specific expertise is required (Peter Norvig on programming texts, the Google employee who's uncovered numerous fraudulent USB-C cables), one qualified reviewer trumps millions who know nothing. (But you've got to ensure that the qualified reviewer stays both honest and engaged.)

More generally: the Amazon experience seems to be worsening. Trust is falling to eBay-like levels, if not worse. Search and discovery are poor, and product quality is all over the map. Books are one matter -- a highly uniform product with a simple function. Expensive kit of various descriptions is another. Among my recent purchases was an LED cabinet-lighting system for which local, major-chain, and Amazon sources (brick-and-mortar and online) all failed. Ikea turned out to have a well-thought-out product line, helpful (though ultimately not quite what I was looking for) information online, and, most importantly, an in-store display in which I could assess, mix, match, and ultimately assemble the system I'm using and enjoying now. Twice the price of what I'd initially specced out, but that initial system didn't work in the least.


perseusprime11 2 days ago 0 replies      
I wish there were something similar for the App Store to weed out fake reviews.
tracker1 2 days ago 0 replies      
Sounds like an opportunity for a fakespot browser extension...
themartorana 2 days ago 3 replies      
You can filter Amazon reviews by "Verified Purchase", which, unless someone tells me differently, at least means the product was purchased. It's annoying, but those reviews feel a bit more honest.
dang 2 days ago 3 replies      
Please don't rewrite titles unless they are misleading or linkbait: https://news.ycombinator.com/newsguidelines.html

(Submitted title was "Amazon Reviews Are Gamed by Compensated Reviews by Professional Review Companies".)

chatmasta 2 days ago 1 reply      
Amazon likes to make it look like they are "cleaning up" the fake reviews industry, by suing companies out of existence. But in reality they're just trying to drive sellers to use their own, official "paid review" product. [0]

Granted the Amazon program is more transparent (as the review is affixed with a label that says "user was compensated for this review"), but it's disingenuous at best for Amazon to act like they're taking the moral high ground by shutting down these review sites.

[0] https://www.amazon.com/gp/vine/help

Apple R&D Reveals a Pivot Is Coming aboveavalon.com
278 points by rstocker99  4 days ago   261 comments top 49
beloch 3 days ago 6 replies      
According to this chart, Apple spent $6B in 2014 and is projected to spend just over $10B in 2016.

For comparison, in 2014 Volkswagen spent $13.5B, Samsung spent $13.4B, and Microsoft spent $10.4B. [1]

Apple has, historically, not been a big spender when it comes to research, tending to favor short-term, tightly focused research over long-term, curiosity-driven research. Over the last few years their research budget has ballooned, but only in 2017 (projected) will they reach the levels the above companies were spending at in 2014. Has Apple's research focus changed?

Perhaps we're not seeing some monumental project in the works, although an autonomous car would be pretty big. Perhaps what we're seeing is Apple deciding to loosen the purse strings and, instead of stashing away obscene piles of money, they've loosened their research focus and now have a lot of people doing whatever the hell they want, much like MS, just because something useful might someday emerge.

Let's face it, Apple has so much money they could probably launch their own space program if they really wanted to.


abalone 3 days ago 4 replies      
It's a car, and it's not a "pivot". The Mac, iPod, iPhone, iPad, Apple TV, and Watch are all about applying the same philosophy to new product categories and then tightly integrating them. At the core of it is fundamentally rethinking the interface between user and machine.

With cars it makes perfect sense as long as you do one thing first: throw out everything you've read about fully autonomous cars being right around the corner. If instead we're facing a future of semi-autonomous capabilities which still require a human to oversee and guide -- much like the autopilot controls of airplanes -- then there is a massive opportunity to rethink the automobile interface from the ground up around this new hybrid approach.

You can be sure that Apple's car will not just be a Jony Ive designed Tesla. It will involve a rethinking of the user interface. Apple likes to make a 10X difference when entering product categories and that's been the key element. Effective autopilot assistance features can plausibly get us to 10X improvements in safety and convenience.

Apple's functional organization is one of its secret weapons in applying this philosophy consistently across so many consumer categories and creating a halo effect. I would be very surprised if they changed that for computers-with-wheels.

paulftw 3 days ago 1 reply      
I think the median software engineer salary also followed a hockey-stick graph over the last couple of years, as did CA real-estate prices, so it isn't clear that Apple grew headcount at all. When the iPhone came out there was no App Store and no 3rd-party apps; when apps arrived they had no push notifications; iCloud didn't exist; and so on. Products become exponentially more complex, requiring proportional growth in R&D. There may even be tax incentives to maximize R&D spending.

Smart home / smart appliances would be a much more logical step for Apple, as they'd be much closer to their area of expertise in electronics. Unlike with the iPod and iPhone, there's at least one company that is already doing all the right and hard things around cars: Tesla.

TVs, kitchen appliances, even lights are much better understood and researched. The technology is there, safety/regulatory barriers are lower than with cars, and consumer demand is pretty obvious. Yet existing manufacturers release product after product with laughable design and usability, just like used to be the case with Nokia and Windows smartphones.

darawk 3 days ago 7 replies      
Is this a surprise? Isn't it well known that Apple is working on a car?

I think the big question here is not whether they are working on a car, but whether they can deliver a car that represents a significant improvement over what's out there. And that I'm highly skeptical of, unless they deliver full autonomy. But there is just no way Apple is going to beat Google to market with that technology.

So, I agree they're working on a car, but I feel fairly confident that it will be an absolute disaster for them. Though most times that's been said about their products in the past decade or two, it's been a disaster for the person saying it.

Communitivity 3 days ago 2 replies      
I don't understand the confusion on what Apple's next big thing is, they've laid it out time and again.

Come to think of it, I think I just saw a slide with it right on the single slide during the Viv presentation. I went back and found it. Here is the video URL at the time the slide is shown: https://youtu.be/Rblb3sptgpQ?t=51

It's the same thing OpenAI, HARC, and other groups are working on: creating revolutionary intelligent assistants and possibly achieving strong AI. The chance to work on that project at Apple, that I might consider a move for.

Applied AI combined with conversational or AR interfaces to create specialized personal assistants will be the next in the line of game changers such as car, flight, radio, telephone, computer, rocket, internet, cell phone, the web, smart phone, tablets, and data mining.

baron816 3 days ago 6 replies      
I don't like the idea of Apple (or even Google, for that matter) making cars. It seems like such a weird space for a company that makes consumer electronics to move into. It would be almost as weird as them starting to build houses. And I don't want to buy all my stuff from one company.

Either way, I'm skeptical that any company is going to make huge, monopolistic profits from selling autonomous cars. It's something that's going to be easily commoditizable. Consumers aren't going to care which company they buy their autonomous tech from, as long as it's safe and it works.

A better fit (although much, much, much harder to achieve) for Apple would be a real robot/android. Apple has hundreds of billions in cash to play with and a steady stream of income on the horizon for the next few years. I think they should accept that they don't have to come out with the next big hit every year. They should look 10+ years in advance and try to beat everyone to the last consumer electronic product.

no1youknowz 3 days ago 3 replies      
According to VentureBeat [0], Samsung has 1M Gear VR users. We all know how Apple has a far superior supply chain, creates better software experiences, and does things "right".

So is it that much of a stretch for them to actually come out with a mobile VR headset?

Think about it.

- They make the phones.

- They make the vr software.

- They already have the eco-system for you to get apps from and the developers to sell through.

- They can sell upgrades for face/hand tracking, etc.

What about a version of FaceTime, where 2 people can call each other and play a game together?

Within a very short time they could exceed what Oculus is doing.

[0]: http://venturebeat.com/2016/05/11/oculus-and-samsung-have-1m...

rdl 3 days ago 2 replies      
The other depressing interpretation is that they're achieving less per dollar spent on R&D (becoming less efficient); they certainly seem to be achieving less per product release now, so it would be consistent. Maybe there was a time when Apple got the most brilliant people wanting to work there just because it was Apple; if that's no longer true, their efficiency would decrease.

(Arguably the same thing happened in K-12 education; when women were largely excluded from other professions, teaching positions were filled by the best women in the workforce; now, many of them would rather be doctors/engineers/lawyers/etc., so the quality of educators has decreased to the market level.)

bresc 3 days ago 6 replies      
I have to ask: Why cars? Why are Google, Tesla and now Apple supposedly investing in cars?

From the European point of view, cars seem to be a dead end. More and more people live in urban areas, and more and more young people use public transportation way more often than cars. Additionally, carsharing makes owning a car in a German city kind of obsolete.

So... why cars?

carsongross 3 days ago 1 reply      
I wish it was possible for Apple to settle in and polish, competently executing on designs that are already very good and slowly improving their entire lineup (including, please, monitors!)

It's rather depressing to consider that excellent industrial design can only be supported on the exponential-looking part of the sigmoid curve.

mangeletti 3 days ago 0 replies      
I have long thought that Apple was working in secret on a consumer robot; perhaps something on the order of $1-5k price point. It was just a hunch.

I've now begun to think that the increase in what everyone is calling "R&D" is just Apple becoming too large to be efficient any longer. Perhaps they've crossed the point mentioned in The Mythical Man-Month where increases in team size decrease productivity.

JustSomeNobody 3 days ago 0 replies      
> After analyzing the three preceding possible explanations for Apple's R&D increase, we can conclude the only one that actually makes sense is the third choice: Apple is looking to pivot.

Can we really conclude that? Good geez.

reissbaker 3 days ago 1 reply      
Apple was once considered extremely innovative despite its secrecy because it regularly shipped ground-breaking new products like OS X, its breathtakingly high-design computers in the late 90s and early 2000s, the iPod, and of course the iPhone.

Apple hasn't shipped a successful new product line since Steve Jobs' death, and sales of its last successful major product, the iPhone, now seem to be slowing. Their cloud efforts are famously flailing, and their software is increasingly stale: who here has a folder full of unused Apple apps, with better third-party replacements on their home screen or dock? Almost everyone with an Apple device.

I sit here writing this on a Macbook, unironically. Their hardware manufacturing capability is best in class, and they've managed to hone their existing products' hardware increasingly close to perfection. But innovative? You can't be innovative just by spending money on R&D. You have to ship new products. And for now, Apple seems like it can't.

datashovel 3 days ago 2 replies      
I think the ideas presented appear somewhat contradictory. I may be missing something here because the following 2 ideas seem contradictory to me.

1) Apple's secrecy

2) Apple's pivot

If they're pivoting, why be secret about it? One of the reasons provided was that being public about new products could hurt current product sales.

If it's an entirely new product line (ie. car), I have a hard time believing someone will think twice about buying an iPhone because Apple is coming out with an electric car.

Instead being secret seems it may hurt more than help. If I'm in the market for a new car, and there's a real release of an affordable Tesla that has already happened, while at the same time I have no idea what Apple is doing or if they're even going to do it or what kind of timeline the project is on, I'm probably going to buy the Tesla.

Otherwise if I'm somewhat certain that Apple may be coming out with a new car within a year or two, I may decide to hold off with my purchase of the Tesla for the time being in order to consider buying the Apple iCar when it comes out.

jimmytidey 3 days ago 2 replies      
Isn't it basically impossible for Apple to surprise us with a driverless car? Surely the whole process of making a driverless car legal would have to be public?

Surely the production line for any kind of car at all would be almost impossible to organise covertly -- all the factory space, tooling, etc.?

fblp 3 days ago 0 replies      
I wonder if this could also be some smart accounting. The US and other countries have tax incentives that encourage R&D expenditure.
boznz 3 days ago 1 reply      
Assuming its a car..

Cars are not phones, and I don't see the massive factory/production line being prepared to build the cars, nor the investment in battery production that would be required (assuming an EV, not an ICE). So unless they buy BMW (LOL!) or another car manufacturer, it will be 3 years minimum from them showing a car to us being able to buy it.

aetherson 3 days ago 1 reply      
The idea that Apple is going to "pivot" is fatuous.

Are they working on some non-phone products that they hope will be big? I'm sure they are.

Could they conceivably find that one of those products achieves massive traction and overshadows the iPhone (especially as the market for smartphones cools)? Sure, though I'd bet against it actually playing out like that.

Is Apple making a big planned play to radically deemphasize smartphone sales in a desperate gamble to become a car company? Of course they aren't.

Even if the smartphone market's high-water mark was 2015, Apple has a hugely successful, almost grotesquely profitable product that will -- assuming they don't do something absurd like pivot away from it -- be the source of incredible value for at least a decade and probably much more. On the back of the iPhone, Apple has grown to a market cap that -- even after recent losses, and despite a weirdly low P/E -- is massively higher than the combined market caps of the top five auto companies.

brandonmenc 3 days ago 1 reply      
They are not working on "just a car."

They are trying to become a mass transit provider by owning and operating a large fleet of those self-driving cars, renting them out on a per-ride basis.

That is, if they're smart.

Animats 3 days ago 1 reply      
Apple may be developing automatic driving technology, but actually making cars seems unlikely. They mostly outsource manufacturing, after all.

Apple might be working on something to take a bite out of Facebook or Google Search or Amazon's Echo. The next big thing is likely to be "you just talk to it and it does the right thing". Siri with common sense.

exabrial 3 days ago 1 reply      
All I want is an Ethernet port on my MacBook Pro. Please stop the maddening war on ports.

Edit: sorry I meant Macbook pro

nickpeterson 3 days ago 0 replies      
Has anyone mentioned that maybe Apple just doesn't have the focus (Jobs) at the top telling it where to spend R&D? How much did Apple spend on R&D under other CEOs when Jobs left?
stevewilhelm 3 days ago 1 reply      
I don't think Apple would necessarily have to build their own cars from scratch or solve the alternative fuel issue or deliver a self-driving car to have a significant impact on the automobile industry.

For example, I would personally love a Subaru that had its information and entertainment systems designed and built by Apple and licensed to Subaru. It would also be cool if Apple's designers helped with the overall exterior and interior design.

woodpanel 3 days ago 0 replies      
The author is correct that Apple is haunted by the iPhone's success. My 2 cents:

1) The car business delivers lower margins than the iPhone. Even if you compare them to those of premium car companies (like Range Rover, Audi, Mercedes, BMW or Lexus).

2) The iPhone didn't have comparable brands to compete against when it entered the phone market. It came with a hefty price tag, a sum that had never even been considered mass-marketable before. The iPhone created the premium sector for phones. It raised the share of income people considered plausible for cellphones.

The author is also right in that Apple is very good at applying their business model to new markets. But

3) to become premium in the car business means one of:

- lowering their margins

- producing cars that aim at a smaller/niche market (Porsche to Ferrari/McLaren),

- lowering the costs of the supply chain currently in use by other premium car makers.

The last one would be doable, certainly by Apple, certainly with EVs. But I doubt it would be doable without any information leak other than this article.

4) More plausible for Apple's margin territory would be an attempt at the user interface of cars, like Apple TV is for television sets, or the HealthKit API.

The problem with cars is not the getting-from-A-to-B part, not the status part or the comfortable-interior part. The problem with cars is the user-interface part: the outdated-touchscreens part, the connecting-and-charging-and-holding-your-phone part. It's the car's software that sucks. And for some, that sucking software includes the driving-yourself part. Reaching out to current premium carmakers also points in this UI/API direction.

5) Also, there are a lot of lower-hanging fruits out there, easier to tackle than building a premium car: VR, physical television sets, home automation, healthcare -- or wait, software.

serge2k 3 days ago 1 reply      
Is the car before or after the TV they were definitely going to release a few years ago?
zargath 3 days ago 0 replies      
Apple snatched a few Tesla employees, and a few people have said that Apple is working on a car. Seems obvious.

Has anybody considered that it might be another form of transportation? Maybe an electric motorcycle or some other cart? All their devices are very personal; it could be a personal transportation device.

Apple's mission seems to be to create the best products they know how to build. But environmentalism is pretty highly valued at Apple, so it seems likely that they will move towards things that they think will benefit not only their customers but also the environment.

But Tesla's mission seems to be about accelerating the adoption of electric cars. They need "everybody" to produce electric cars, so it is in Tesla's best interest that Apple builds a car. It will be interesting to see if Apple picks up on Tesla's R&D, with the open-source patents and maybe even a Tesla Gigafactory.

Very interesting to follow, though; I hope they make a car and hope they make it great. But I'm afraid they lost their cocky edge a few years ago, so it will probably just be a nice new car.

herghost 3 days ago 1 reply      
Why not financial services?

There's a handful of small players testing the banking regulations now (like Mondo) to get authorised as basically "iPhone banks" - modern, mobile, without the legacy and technical debt.

With Apple Pay (and their $billion reserves) Apple could conceivably sweep in and buy up these newly regulated challenger-banks and buy their way into FS overnight.

thbb 2 days ago 0 replies      
Apple certainly does well in investing in R&D and looking towards some amount of pivoting.

However, this quasi-exponential growth in spending reminds me of a pathetic moment at an EU Commission meeting I attended in 2010, on public research and innovation funds.

The EU commissioner basically complained that Europe had no Unicorns like the GAFAs, but wanted to be reassuring: this is about to change. We will prevail, because we'll spend more on R&D (through tax money) than all of these do together.

With the implicit assumption that the more you spend, the bigger the returns.

Corrado 3 days ago 1 reply      
I think I would be more excited if Apple were researching ways to remove the cell phone completely, at least in a physical sense. Think about it, with the rise of things like AR/VR for output and Viv for input, what need do we have of a handheld device? Everybody is concentrating on the next iPhone, but what if it is a set of contact lenses and headphones? I think that would truly blow people's socks off.

Sure, Apple could be (and probably is) working on an automobile of some sort, but that seems like it would be difficult to really make a 10x improvement on, especially with all the rules and regulations about what can be considered a car: it must have mirrors, steering wheel, pedals, etc. The prospect is so limiting. Besides, I think Tesla is rocking the automotive world, and Apple would probably be better off teaming up with them than going it alone.

Just my $.02 worth.

raverbashing 3 days ago 0 replies      
No what I think is happening is this:

- Apple is investing more in ancillary services (iCloud, Siri, Maps, new Arm processors, etc)

- More investment into manufacturers to get the latest tech (retina, etc)

- Some of it is lacking focus and spending a lot of "R&D" money in irrelevant stuff (like most big companies do)

BIackSwan 3 days ago 0 replies      
Great post.

A keyword is missing: autonomous car. Maybe the first couple of versions will be manual, but they can be relevant in the long term only if they are working on bringing autonomous cars to market as fast as Tesla/Google.

inmyunix 3 days ago 0 replies      
Car seems likely, yes, but I would place bets on an augmented reality play as well.
draw_down 4 days ago 0 replies      
Maybe not a huge mystery- they're working on a car, at the very least.
Mendenhall 3 days ago 0 replies      
Maybe apple is moving into weaponry, now that would be a pivot!
mulcahey 3 days ago 1 reply      
I really wonder how they will differentiate themselves from Tesla's offerings. Tesla represents the very cutting edge of innovation in the automotive industry, being the leader in autonomy, EVs, battery production, the charging network, etc.

I wonder if Apple will partner with Tesla to use their Gigafactory batteries and charging network. If not they would have a lot of ground to make up.

kukabynd 3 days ago 0 replies      
Ever since the talk about the Apple Watch, there has been a lack of the wow effect people had expected from it. Considering Apple's human power, there is something going on that takes time. Might be a car, might be something else. Smartphones have plateaued, so they've been working on something greater since long before growth slowed down. Time will show.
lr 3 days ago 0 replies      
Has this car thing in relation to Apple been confirmed? Nothing about the word Titan makes me think of cars.
dbcooper 3 days ago 0 replies      
Perhaps some of this is that they are accounting more expenses as R&D for tax reasons?
Grue3 3 days ago 0 replies      
What are the margins in the car industry? iPhone margins are enormous, so how do cars really compare? Is it even worth pivoting into this industry when you're already a big player in a more valuable one?
justaman 3 days ago 0 replies      
They are working on a "next-gen" operating system. Something entirely unlike what we have come to know as a GUI.

I just thought I would leave this comment here so in a few years someone will find it.

prawn 2 days ago 0 replies      
I suspect the Titan codename is a trick and they're actually working on single-occupant vehicles.

HMD and transport are the obvious markets to tackle.

samsonradu 3 days ago 0 replies      
Until the next big thing comes, AAPL shares just hit a 2-year low, so people might be losing their trust/patience.
mozumder 3 days ago 2 replies      
There's so much Apple could be doing that are a more natural progression from their existing product lines. They could take inspiration from some of their previous consumer efforts to target entirely new product lines, such as a camera (QuickTake), printer (LaserWriter), or game console (Pippin). A modern version of all these would sell out.

Or they could reintroduce their enterprise/business products, such as XServe.

The electric car idea really won't be ready anytime soon, due to limits of neural-net learning speed.

Inconel 3 days ago 1 reply      
I've read a number of articles about the prospect of an Apple car, but most have still left me with more questions than answers.

My main questions regarding an Apple car are the following:

1. Apple seems reluctant to chase market share at the expense of profit. The automotive industry is very well established and, from my understanding, operates on razor-thin margins. Apple seems to be a master of the supply chain, but much of this has to do with the fact that they ship such large numbers of very similar hardware. Apple doesn't ship the most smartphones in the world, but they do ship the most of a single model, and this allows them to put incredible pressure on their suppliers. Does anyone expect Apple to be able to move so many cars that they would be able to put more downward pressure on suppliers than the traditional automakers?

2. If the strategy isn't to dominate the market and thus assert downward pressure on suppliers, then I would assume the strategy would be to sell a premium car that carries higher margins. Apple obviously has a well-deserved reputation as a premium product within the computing/electronics space, but is it a given that this would translate to the automotive market as well? I don't particularly care for a brand's prestige, but I know this does inform many consumers' decision making. Apple might be viewed as a premium brand compared to Samsung/Moto/Lenovo/HP/etc, but would most consumers be willing to pay a premium for an Apple car compared to a BMW or Mercedes?

3. Is there any reason that Apple's prowess in the electronics supply chain may not translate well to the automotive supply chain? These two industries seem very different to me.

4. Apple, like almost everyone else in the tech industry, uses outsourced manufacturing for the majority of their products. While there are similar contract manufacturers in automotive, they are not nearly as large, nor is their use as widespread, to my knowledge at least. Foxconn was already one of the biggest electronics manufacturers even before they started building iPhones for Apple; I don't think a Foxconn equivalent exists in the automotive world. I would imagine this may present problems for someone like Apple that doesn't plan on actually building their own cars, or do they?

5. There is considerable risk when launching a car model, even for the established players; I would assume the potential risk for a newcomer would be even greater. Newly introduced cars seem to suffer from more serious and more widespread problems than newly launched smartphones/computers do, although those often suffer problems as well. Does the potential exist that problems encountered with a newly introduced Apple car may carry over to negative perceptions about the broader Apple brand? It seems enormously perilous to me to risk your brand reputation on one specific product line that traditionally tends to be both very low margin and suffer from frequent defects.

I'm curious what others may think of my concerns, or if anyone knows of articles that have gone into more depth on them.

bitmapbrother 3 days ago 2 replies      
Apple's not going to pivot. I'm sure they'd like to, but I can't really see what they would pivot to. Instead they'll just throw their hat into the usual next big thing everyone else is chasing - VR / AR and electric cars - which I don't think they'll have a lot of success in. Their bread and butter will always be the iPhone and they'll ride that wave until it comes crashing to the shore.
davesque 3 days ago 0 replies      
Even if Apple is working on a car, I can't imagine that it will cost any less than $50,000. I've always felt the amount of money people spend on luxury vehicles is obscene.
anamoulous 3 days ago 0 replies      
Search engine.
slantaclaus 3 days ago 2 replies      
Fire Tim Cook?
heifetz 3 days ago 1 reply      
Apple needs a visionary. Google, Amazon and Facebook all have founders who are visionaries and drive the focus of the company. Tim Cook could be a great CEO, but he is not a visionary and does not drive products. Who is driving the vision at Apple? It's not Jony, I have no idea who is doing that. That is also something you can't do by committee. Apple seems to be really floundering and does not have any great visions for new products or focus to make people excited! People have certainly slowly lost interest in the iPhone. Apple needs to spend its cash and bring Elon Musk onboard!
Setting Up a Deep Learning Machine from Scratch github.com
308 points by IamFermat  1 day ago   59 comments top 13
minimaxir 1 day ago 3 replies      
The dependency hell required to run a good deep learning machine is one of the reasons why using Docker/VM is not a bad idea. Even if you follow the instructions in the OP to the letter, you can still run into issues where a) an unexpected interaction with permissions/other package versions causes the build to fail and b) building all the packages can take an hour+ to do even on a good computer.

The Neural Doodle tool (https://github.com/alexjc/neural-doodle), which appeared on HN a couple months ago (https://news.ycombinator.com/item?id=11257566), is very difficult to set up without errors. Meanwhile, the included Docker container (for the CPU implementation) can get things running immediately after a 311MB download, even on Windows which otherwise gets fussy with machine learning libraries. (I haven't played with the GPU container yet, though)

Nvidia also has an interesting implementation of Docker which allows containers to use the GPU on the host: https://github.com/NVIDIA/nvidia-docker
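To make the container route concrete, here is a print-only sketch; the image names (`alexjc/neural-doodle` from the project's README, `nvidia/cuda` from the nvidia-docker examples) and exact flags should be treated as assumptions that may have drifted since.

```shell
# DRYRUN=echo prints each command instead of running it, so this is safe to
# execute anywhere; remove it to actually pull and run the containers.
DRYRUN=echo

# Plain Docker: run the CPU build of Neural Doodle without touching host deps.
$DRYRUN docker run --rm alexjc/neural-doodle --help

# nvidia-docker: same idea, but the container can see the host GPU and driver.
$DRYRUN nvidia-docker run --rm nvidia/cuda nvidia-smi
```

The second command is the canonical smoke test: if `nvidia-smi` works inside the container, the GPU passthrough is wired up.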

JackFr 1 day ago 2 replies      
Disappointed. Misread it -- I thought he was going to do deep learning with https://scratch.mit.edu/, not from scratch.
pilooch 1 day ago 0 replies      
Commoditizing deep learning is mandatory. After repeated in-production installs at various corps, while connecting to existing pipelines, I've convinced some of them to sponsor a commoditized open-source deep learning server.

Code is here: https://github.com/beniz/deepdetect

There are differentiated CPU and GPU Docker versions, and as mentioned elsewhere in this thread, they are the easiest way to set up even a production system without a critical impact on performance, thanks to nvidia-docker. It seems they are more popular than AMIs within our little community.

mastazi 23 hours ago 4 replies      
I'm sorry if this is only tangentially on topic:

I was reading the article and got to the part related to installing CUDA drivers.

I am currently in the market for a laptop which will be used for self-learning purposes, and I am interested in trying GPU-based ML solutions.

In my search for the most cost-effective machine, some of the laptops that I came across are equipped with AMD GPUs, and it seems that support for them is not as good as for their Nvidia counterparts: so far I know of Theano and Caffe supporting OpenCL, and I know support might come in the future from TensorFlow [1]; in addition, I saw that there are solutions for Torch [2], although they seem to be developed by single individuals.

I was wondering if someone with experience in ML could give me some advice: is the AMD route viable?

[1] https://github.com/tensorflow/tensorflow/issues/22

[2] https://github.com/torch/torch7/wiki/Cheatsheet#opencl

zacharyfmarion 1 day ago 0 replies      
I posted something similar on my blog (http://zacharyfmarion.io/machine-learning-with-amazon-ec2/) not too long ago. Would be nice if there was a tool that set all of this up for you!
vonnik 1 day ago 1 reply      
I work on Deeplearning4j, and I'm told that the install process is not too hellish. Feedback welcome there:



Someone in the community also Dockerized Spark + Hadoop + OpenBlas:


The GPU release is coming out Monday.

profen 1 day ago 1 reply      
The steps are pretty neat. Also agree on the driver and tools installation. Just painful and long.

Looks like there are separate Torch and Caffe AMIs for Amazon as well. Going to try them later.



visarga 1 day ago 2 replies      
Is there a host offering GPU systems preconfigured with ML frameworks and models, for playing around? Something simple to use like Digital Ocean.
amelius 1 day ago 2 replies      
Step 1: make sure that your machine has sufficient free PCI slots for the GPU cards, and that you have sufficient physical space inside the machine.

Seriously... why can't there be a better way of adding coprocessors to a machine? Like stacking some boxes, interconnected by parallel ribbon cable, or something like that?

profen 1 day ago 0 replies      
Have used this DIGITS AMI on AWS in the past for Caffe and Torch.


tzz 1 day ago 2 replies      
If someone creates a Juju Charm https://jujucharms.com for this, then you can use the pre-configured service on any of the major public clouds.
tacos 1 day ago 1 reply      
I don't understand the fascination with these "make a list" style setup instructions, as they're almost immediately outdated, and seldom updated.

We have AMIs, we have Docker, we have (gasp) shell scripts. It's 2016. Why am I cutting and pasting between a web page and a console?

To my knowledge the only thing that does something like this well is oh-my-zsh. And look at the success they've had! So either do it right, or don't do it at all.

raverbashing 1 day ago 4 replies      
No, you don't need to restart your machine after you install CUDA.

Also you might not need to restart after you install the drivers, this is not Windows. (But there might be some rmmod/modprobe needed)

> If your deep learning machine is not your primary work desktop, it helps to be able to access it remotely

Yes, use ssh.
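The rmmod/modprobe path alluded to above can be sketched like this; the module names are the standard NVIDIA ones but vary by driver version, so treat them as assumptions. `DRYRUN=echo` keeps the script print-only.

```shell
# Reload the NVIDIA kernel module instead of rebooting after a driver install.
# DRYRUN=echo prints the commands; drop it (and run as root) to apply.
DRYRUN=echo

$DRYRUN sudo rmmod nvidia_uvm nvidia   # unload the old modules (order matters)
$DRYRUN sudo modprobe nvidia           # load the freshly installed driver
$DRYRUN nvidia-smi                     # verify the driver answers

# And for remote access, plain ssh really is all you need:
$DRYRUN ssh user@deep-learning-box
```

If anything still holds the module (X server, another CUDA process), the rmmod will fail and you have to stop that process first.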

HARC ycombinator.com
414 points by sama  4 days ago   100 comments top 35
sgentle 4 days ago 2 replies      
So, if I understand this correctly, it's the latest in a series of attempts by Alan Kay to find the right home for his vision of a new Xerox PARC. That began with VPRI (NSF funded), then CDG (SAP funded), and now HARC (YCR funded).

As I understand it, VPRI wrapped up because they ran out of money. I wonder what caused the move away from CDG?

Regardless, I hope he succeeds before we run out of four-letter acronyms funded by three-letter acronyms.

(Edit: CPRG -> CDG, my bad)

ozten 4 days ago 1 reply      
Last week VPRI[1] published the final "STEPS Toward the Reinvention of Programming" paper[2].

Although it is from 2012, it had been unavailable publicly until now. It's a great way to catch up on how FONC ended, and it was worked on by many of the individuals cited as part of HARC.

[1] http://vpri.org/html/writings.php

[2] http://www.vpri.org/pdf/tr2012001_steps.pdf

jwise0 4 days ago 4 replies      
On the one hand, I'm happy to see visionaries like Vi Hart and Bret Victor (and presumably the others who I don't know of their work, but can only imagine to be quite good) supported to do the work that they do.

On the other hand, I am a little concerned to see InfoSys joining the fray here. InfoSys, to my understanding, are basically the face of H-1B abuse. So I'm happy to see these people funded, but it's a little harder for me to cheer when the funding comes from such a questionable source...

tylercubell 4 days ago 8 replies      
As a layperson, I have almost no idea what any of this means.

> HARC's mission is to ensure human wisdom exceeds human power, by inventing and freely sharing ideas and technology that allow all humans to see further and understand more deeply.

Isn't this what the internet is for? What's new?

> Our shared vision of technology combines an expansive long-term view with a strong moral sense. We look to the distant past as well as the far future. We reject the artificial boundaries created between the humanities, arts, and sciences. We don't always agree on what is good or evil, right or wrong, but we use these words seriously and are driven by them.

This is so vague I have no idea how you can attach any meaning to it.

> We are focusing on areas where we believe the structures created today will have the most impact on the future, and that can most benefit from having dedicated resources outside the for-profit world. At the moment, these areas include programming languages, interfaces, education, and virtual reality.

So you're gathering a group of smart people together to do non-profit research in a few select fields with the goal of improving humanity?

amasad 4 days ago 6 replies      
I'm excited to see what comes out of this. The area I'm hoping they'd look at is programming. There haven't been any big ideas in programming in a really long time. Languages are rehashes of the same features with slightly different configurations and incremental improvements in performance or tooling. Programming interfaces are stagnant too -- Smalltalk's interactive development environment is still the sci-fi version of what most people think programming could be.

The programmer and the program have never been further away from each other than they are today. The development environment has never been further away from the production environment than it is today. And end-user interfaces have never been as abstracted away from computing as they are today.

If any group can breathe some life into this stagnant area then my bet would be on this group.

davidw 4 days ago 1 reply      
Alan Kay? Ok, I'm impressed!

It does sound kind of vague. I hope it's not so open ended that it just means kind of noodling around with interesting stuff. But hey, Alan Kay's noodling would still be good, I bet. Interested to see what comes out of it!

cpr 4 days ago 1 reply      
Most of these luminaries worked at SAP until recently.

Are they being spun out into HARC, or are they just collaborating from inside SAP's blue-sky research group?

It would be helpful to clarify what their institutional standings are, if we're to understand the full import of this announcement.

nkoren 4 days ago 2 replies      
May I make a human advancement suggestion which has very little to do with technology?

Teach about cognitive bias. Teach it at a fundamental level -- like, starting in 1st grade, with annual refreshers thereafter. Give cognitive bias an equal place at the table alongside reading, writing, and arithmetic.

Many cognitive biases are deeply rooted in biological and social structures -- but with awareness and training, they can be overcome. Without awareness and training, they can be profoundly destructive, and certainly limit the scope of human advancement.

So before we start pimping out our metacortexes or whatever, let's see if we can't overcome some of the less salutary whisperings of our old and honestly pretty useless reptilian brains. We can be better than that.

apsec112 4 days ago 4 replies      
This isn't a judgment of the project itself, but the announcement's wording reminds me of the marketing newspeak from "How to Apply to Y Combinator":

"The best answers are the most matter of fact. It's a mistake to use marketing-speak to make your idea sound more exciting. We're immune to marketing-speak; to us it's just noise. So don't begin your answer with something like

We are going to transform the relationship between individuals and information.

That sounds impressive, but it conveys nothing. It could be a description of any technology company. Are you going to build a search engine? Database software? A router? I have no idea.

One test of whether you're explaining your idea effectively is to ask how close the reader is to reproducing it. After reading that sentence I'm no closer than I was before, so its content is effectively zero. Another mistake is to begin with a sweeping introductory paragraph about how important the problem is:

Information is the lifeblood of the modern organization. The ability to channel information quickly and efficiently to those who need it is critical to a company's success. A company that achieves an edge in the efficient use of information will, all other things being equal, have a significant edge over competitors.

Again, zero content; after reading this, the reader is no closer to reproducing your project than before. A good answer would be something like:

A database with a wiki-like interface, combined with a graphical UI for controlling who can see and edit what.

I'm not convinced yet that this will be the next Google, but at least I'm starting to engage with it. I'm thinking what such a thing would be like.

One reason founders resist giving matter-of-fact descriptions is that they seem to constrain your potential. But it's so much more than a database with a wiki UI! The problem is, the less constraining your description, the less you're saying. So it's better to err on the side of matter-of-factness." https://www.ycombinator.com/howtoapply/

imh 4 days ago 0 replies      
>HARC researches technology in its broadest context, which includes: technology for communication (from the invention of spoken language to modern data graphics), intellectual tools (such as the scientific method and computer simulation), media (from cave painting to video games), and social systems (including democracy and public education).

In that broadest context, it really seems to lose its meaning. I think a closer word to that description is "ideas." Why try to cram the word "technology" into it, when it's such a stretch?

robot 3 days ago 0 replies      
I didn't understand anything from this post.
karmicthreat 4 days ago 2 replies      
YC seems to be trying to reach into a number of different areas lately. Maybe they find that they don't have enough innovation walking through their door anymore. Or that they need to go the extra mile to push things forward themselves.
benatkin 3 days ago 0 replies      
The description of what they hope to achieve is vague, in a good way. I'm sure a lot of people from the rationalist community in the Bay Area, who already admire YC, will love this.
vbit 4 days ago 2 replies      
Very exciting! How can I join? :D
rjurney 4 days ago 0 replies      
Very impressed they hired Bret Victor. His brilliance outpaces anyone else in his field, and he communicates well.
arca_vorago 4 days ago 0 replies      
I hope they realize that copyleft and protection of users is a prereq for this bit, "to ensure human wisdom exceeds human power, by inventing and freely sharing ideas".

I am waiting for the days when some big names wake up and say "RMS has been right all along, we need to copyleft everything asap!"

I'm getting tired of hearing about $NEXTGREATTECH and then finding out it's a SaaS or proprietary etc.

Here's to hoping this group does something productive and free as in beer and speech.

aroman 4 days ago 0 replies      
Very interesting, and very exciting!

How will this relate to VPRI?

And are these researchers joining full-time and collaborating together directly, or are they just "part of the project"?

miguelrochefort 3 days ago 0 replies      
I can say without the shadow of a doubt that HARC is working on a new language. Not your average spoken or programming language, but a general-purpose and interactive language. A language that can only be communicated through a computer.

Text and speech are inherently limited by their linear and one-dimensional nature. Graphics are much more powerful, and leverage high-bandwidth senses. Knowledge will be consumed by navigating knowledge hypergraphs and causational trees (how-why axis).

In English, you write everything from scratch. You start with a blank page, then a word, then a sentence. With this new language, you must start with something that exists, some kind of node/edge of the graph. Communication is mostly done by manipulating the knowledge graph. This means that you can't say something that has already been said before. You don't need to provide or explain context. You can instantly see the impact of your thoughts. You can't say incoherent or fallacious things. In most cases, you don't have to say a single thing, as you can find it has been said before. Most decent programming environments provide libraries/modules, auto-completion, and compilation/execution. This new platform brings all these things to communication.

We're talking about a universal language here. A homoiconic language, where the distinction between code, data and UI disappears.

This is nothing new. People have been talking about this for decades. The most important challenge here is not technical, but social. It is becoming clear that the application paradigm is not sustainable and cannot scale. Personal assistants (Siri, Viv, Cortana, Google Now) and Messaging/Bots (Magic, WeChat, Slack, Microsoft Bots Framework, Facebook Bots Platform) are clear symptoms. Although I've been trying to tell people about this future for more than a decade now, I don't have the resources or track-record to be heard. I am hopeful that HARC will change that.

krilnon 4 days ago 0 replies      
I'm excited to see that Alex Warth is involved with this. In my late undergrad days, I was really excited by his OMeta language. It turned out that I'm actually really bad at writing PEGs (or even ANTLR grammars), so I wasn't the best candidate for advancing OMeta into common use, but the idea is really compelling.
fitzwatermellow 3 days ago 0 replies      
I try to make a private study of the global survey of Wisdom Literature in an attempt to distil the essence of what it basically means to live a good life and be a moral person. Everything from Confucius to Aurelius, from Goethe to Gogol. And on and on. Wouldn't it be prudent, whilst you have such an assemblage of noble thinkers, to compile some sort of universal knowledge base of choice epigrams. For the purpose of henceforth explicitly delineating what it means to be "beneficial" and "just" for all future readers to come?

In any case, this looks like a necessary and timely line of inquiry and am looking forward to the fruits of these endeavours. Good luck!

micheljansen 3 days ago 0 replies      
I hope they just give these guys a bag of money and leave them alone to do their thing. They can then connect their work to other guys with a commercial focus where appropriate, without interrupting the flow.
panic 4 days ago 0 replies      
I'm excited to see where this goes! I'm curious, though: Alan and gang were already working on many of these ideas under the aegis of SAP's Communications Design Group. What was the motivation behind starting a new project?
iMuzz 4 days ago 2 replies      
Can someone explain to me what their goal is?
lez 4 days ago 0 replies      
I especially like the term because it means 'battle' in Hungarian.
pcvarmint 3 days ago 0 replies      
Not to be confused with another HARC: http://www.harcresearch.org/about/75
roansh 3 days ago 0 replies      
There is a HARC in Amy Tintera's novel Reboot with similar goals. I wonder if they named it HARC because of the novel, or if it's a coincidence.
phodo 4 days ago 0 replies      
First SLAC (homebrew computer club fame more than the linear accelerator part) and PARC (so many innovations... where to start... right, ask Alan!)... and now HARC.

In good company!

wiz21c 3 days ago 0 replies      
>> Our shared vision of technology combines an expansive long-term view with a strong moral sense.

>> HARC's mission is to ensure human wisdom exceeds human power, by inventing and freely sharing ideas and technology

Free software ?

Ahhh nooo, my mistake, YCombinator is funding the project...

DodgyEggplant 4 days ago 1 reply      
This is great of course, but then again - it feels like humans are the only species on this planet. We really have to start to think as an ecosystem, especially with technology.
edem 3 days ago 0 replies      
This somehow reminds me of the Encyclopedia Galactica from the Foundation series.
masterponomo 3 days ago 0 replies      
I hope they have an airtight written partnership agreement on this thang.
auggierose 4 days ago 0 replies      
Very exciting. I wished there would also be room for a group at YCR/HARC working on collaborative theorem proving (CTP) as proposed here: http://arxiv.org/abs/1404.6186 :-)
tempodox 3 days ago 0 replies      
Equating invention of future computing technologies with human advancement is too pompous for my taste. I won't hold my breath.
nezuvian 3 days ago 0 replies      
off topic, but FYI HARC in hungarian means fight :)
gnarbarian 3 days ago 2 replies      
I don't know about this. The morality stuff sounds like a cult and the rest is so damn vague this whole thing feels half architecture astronaut and half bullshit artist.

But since they are YC funded go ahead and downvote me dead for heresy.


Just the result I expected from you asskissers.

I must, sadly, withdraw my endorsement of Yubikey 4 devices plus.google.com
354 points by v4n4d1s  2 days ago   109 comments top 17
OJFord 2 days ago 3 replies      
From the Github issue [0]:

  > Further hostility against the company or our users will
  > not be tolerated in this forum, and will be met with
  > bans.
Odd reaction. Especially when they've _changed_ from open to closed source, and what benefit is there, really, to a closed-source 'OpenPGP' implementation?

They're looking for a profit, sure, but they're blessed to be a hardware company. It's not like I can just clone their repo and not need to buy their product.

[0] - https://github.com/Yubico/ykneo-openpgp/issues/2#issuecommen...

methou 2 days ago 0 replies      
I was seeking an alternative and cheaper OpenPGP solution to Yubikeys, then I found that the OpenPGP card is essentially a Java applet that lives on a chip running a JVM, which in turn runs on top of the JavaCard OS. Since all the programs follow GlobalPlatform standards, communication with Java Cards can be straightforward.

In the end, it's not difficult to burn an open-source OpenPGP applet to your own card. But there are 2 problems:

1. Bulk sales. If you want to do all the things yourself, and you found an ideal chip (recent NXP SmartMX2 cards have all the fancy stuff you want), almost every reseller only allows bulk purchases, say 100 pcs minimum.

2. Proprietary software. For NXP cards, you need proprietary software to initialize/unlock a card before you can use GlobalPlatform tools to flash your own applets. A reseller told me that this can be worked around by sending raw HEX code with a Transport Key, but I'm not sure about it.
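For anyone curious what that GlobalPlatform workflow looks like in practice, here is a print-only sketch using a tool such as GlobalPlatformPro (`gp.jar`). The cap-file name and the key value below are placeholders, and exact flag spellings can differ between tool versions.

```shell
# DRYRUN=echo makes this safe to execute anywhere: the commands are printed,
# not run. Remove it to talk to an actual card reader.
DRYRUN=echo

# List the applets already installed on the card.
$DRYRUN java -jar gp.jar -list

# Flash an open-source OpenPGP applet (cap-file name is a placeholder).
$DRYRUN java -jar gp.jar -install OpenPGPApplet.cap

# Cards shipped with a non-default transport key need it supplied explicitly
# (key value here is a placeholder, not a real key).
$DRYRUN java -jar gp.jar -key 404142434445464748494A4B4C4D4E4F -list
```

With `DRYRUN` unset this would require a connected reader and a card whose transport key you actually control.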

mgrennan 2 days ago 3 replies      
I've been disappointed in Yubico since I reported a replay attack in their server to them and Steve Gibson a couple of years ago. They gave no reply. Steve replied after I called him out publicly. I'm considering creating something similar based on the USB Rubber Ducky. I'm thinking a simple one-time pad. https://hakshop.myshopify.com/products/usb-rubber-ducky-delu...1s
nickysielicki 2 days ago 4 replies      
I've been looking into purchasing an OpenPGP card/stick for a while. Haven't yet pulled the trigger.

Here are some fully open Yubikey alternatives.




geofft 2 days ago 1 reply      
This is about the code running on the YubiKey itself, not about the code to interact with it from a general-purpose computer?

And if I'm reading the linked GitHub issue correctly, this is about a specific plugin that runs in a sandbox on the YubiKey NEO, where the main codebase of the NEO is still proprietary?

I don't understand the advantage of it being open-source then, at least as far as security goes. (For user freedoms in practice, maybe.) What guarantee do you have that the code on the device matches the code on GitHub, or that the code on GitHub isn't subverted by other code on the device?

awinter-py 2 days ago 0 replies      
Whatever the conclusion here, I'm very glad there are eyes on these devices.

Is there a central clearinghouse for security audits of hardware / software? This is something the FOSS community can do much better than msft or even open source promoters like fb/goog, but not if the results are distributed on the experts' blogs and tumblrs.

tmikaeld 2 days ago 3 replies      
Ah crap, why ruin something great just out of greed (What other reason could there be?) :-(
beezle 2 days ago 3 replies      
The problem with both Yubi and Nitro is that PIN entry is by keyboard, not a secure pinpad.
parent5446 2 days ago 0 replies      
The one thing that I find missing in Nitrokey is that none of their regular keys support U2F alongside other 2FA methods, like Yubikey does. You need the separate U2F device for that, and I don't want to carry around multiple tokens if at all possible.
jc4p 2 days ago 2 replies      
While on the subject, does anyone know how to actually put a 4096 bit key on a Yubikey 4? I've been trying for months and their support is non-existent.
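For what it's worth, the route usually described is GnuPG's `keytocard`. A hedged, print-only sketch follows (KEYID is a placeholder; this assumes a gpg version whose smartcard support handles 4096-bit RSA, which the YubiKey 4 advertises while the NEO is capped at 2048):

```shell
# DRYRUN=echo prints the commands instead of running them; drop it to
# actually talk to the token.
DRYRUN=echo

# Confirm the token is detected at all before touching keys.
$DRYRUN gpg2 --card-status

# Open the key for editing (KEYID is a placeholder).
$DRYRUN gpg2 --edit-key KEYID
# Then, interactively at the gpg> prompt:
#   key 1        select the 4096-bit subkey
#   keytocard    choose the signature/encryption/authentication slot
#   save
```

If `--card-status` already fails, the problem is the card stack (scdaemon/pcscd), not the key size.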
dopkew 1 day ago 0 replies      
I'm glad someone's using 'libre'; glad that I can easily refer to liberated open-source software without ambiguity.
fapjacks 2 days ago 0 replies      
Locked to contributors. Surprise!
jbaviat 2 days ago 0 replies      
Glad to learn Nitrokey has ECC support, even if only 256 bits.
chinathrow 2 days ago 0 replies      
They are pulling a MakerBot.
Dowwie 1 day ago 0 replies      
Hang on just a minute, hackernewsies. Put down your pitchforks and torches.

Do you really expect a leading company of security hardware to give the keys of its kingdom away (pun intended)?

e12e 1 day ago 1 reply      
I don't really see what's new here, that made the author "withdraw his endorsement". It's an issue from 2014, about a device that has always been fully proprietary? Ok, so they make other devices that were in some small way open and ran Free software. Great. But the yubikey devices have never AFAIK really been open in any meaningful sense. So, really this isn't so much yubikey changing what they do, but rather the author not understanding what these devices were in the first place?

As far as I can tell, if you got one of these in the mail, there'd be no meaningful way you could verify that it hadn't been tampered with anyway. So you'd just have to make a leap of faith, and assume it was "secure"? If you were prepared to do that, then fine use the yubikeys. If not, perhaps you should take a deeper look at your usb mouse and keyboard too. Did you verify that your keyboard isn't running some code that might compromise your security?

fred_is_fred 2 days ago 4 replies      
I guess I should know this guy, but I don't. When I see the picture and a post on Google+ it hardly seems like something that I should take seriously. I know the fake mustache is there to show what a fun guy this is, but if you're posting something you want people to take seriously, post it seriously.
New Sublime Text update sublimetext.com
335 points by pyed  3 days ago   223 comments top 23
Overtonwindow 2 days ago 2 replies      
I'm not a programmer, please forgive me, but I love using Sublime as a writing tool for legislation and public policy in my work. The color-coding system popular in programming has been invaluable in drafting legislation.
micnguyen 2 days ago 5 replies      
The company behind Sublime really fascinates me.

Given how many developers I've seen use Sublime, in the modern age of social media I'm so surprised SublimeHQ is still invisible. They hardly do any marketing that I've seen online, no social media engagement, nothing. Not necessarily a bad thing, but Sublime just seemed -primed- to be that sort of company with a hyper-engaged user base.

keithnz 2 days ago 5 replies      
Loved my time with Sublime. Then Atom crossed a line where, while not as good as Sublime in some areas, it got good in other areas... then VS Code came along and showed how snappy an Electron app could be, but didn't have a lot of stuff... then bam, it opened up, and while it still has fewer toys than Atom and isn't quite as slick as Sublime, it hits a bit of a sweet spot somewhere in between.
donatj 2 days ago 1 reply      
I forget that not everyone is on the dev channel and was wondering why this was on the front page. There are new dev builds every couple of days. Didn't realize there hadn't been a stable release in a long time.
ihsw 2 days ago 1 reply      
> Themes may now be switched on the fly without artifacts

Excellent news.

Switching themes and seeing the UI barf is unsightly.

bsclifton 2 days ago 0 replies      
If the author of Sublime is reading this thread, please know that I love your work! I happily paid the $70 and use this as my primary editor (except when using vim when working over SSH).

I'd also like to see more activity on the Twitter account or just more engagement with the community. You've got a killer product :)

baldfat 2 days ago 2 replies      
RANT: "Ubuntu 64 bit - also available as a tarball for other Linux distributions."

Linux requires THREE files (.deb, .rpm and a .tar) I personally use OpenSUSE and can easily compile the software BUT you are not "supporting" Linux when you only support Ubuntu.

PhasmaFelis 2 days ago 4 replies      
I was trying to choose a Mac text editor recently; got it down to Atom and Sublime, and then discovered that both of them do code folding wrong. They think it's based on indentation, not syntax, so if you try to fold something like:

  if (something) {
      some code
  //commented-out code
      more code
  }

it folds everything from the opening bracket to the comment, then stops.

Both editors have had issues filed over this bug for years, which have been ignored. (https://github.com/atom/atom/issues/3442, https://github.com/SublimeTextIssues/Core/issues/101)

I eventually decided to go with Atom; it's open-source, so I can at least aspire to fix the damn thing myself when I get annoyed enough.
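To make the failure mode above concrete, here is a minimal sketch (my own toy illustration, not the editors' actual implementations) contrasting the two folding strategies on that snippet: a brace-matching folder finds the whole `if` block, while a purely indentation-based folder stops at the un-indented comment, just as the parent comment describes.

```python
def fold_by_braces(lines):
    """Return (start, end) line-index pairs by matching { and }."""
    regions, stack = [], []
    for i, line in enumerate(lines):
        for ch in line:
            if ch == "{":
                stack.append(i)
            elif ch == "}":
                regions.append((stack.pop(), i))
    return regions

def fold_by_indent(lines):
    """Fold from a line to the last following line indented deeper than it."""
    def indent(s):
        return len(s) - len(s.lstrip())
    regions = []
    for i, line in enumerate(lines):
        j = i
        while j + 1 < len(lines) and indent(lines[j + 1]) > indent(line):
            j += 1
        if j > i:
            regions.append((i, j))
    return regions

snippet = [
    "if (something) {",
    "    some code",
    "//commented-out code",
    "    more code",
    "}",
]
print(fold_by_braces(snippet))  # [(0, 4)] -- the whole block
print(fold_by_indent(snippet))  # [(0, 1), (2, 3)] -- the fold breaks at the comment
```

Syntax-aware folding (tracking the brace nesting the grammar already knows about) is more work for the editor but gives the result most users expect.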

bsbechtel 2 days ago 1 reply      
I recently updated from Sublime Text 2 to ST3. One thing I loved about version 2 was that it was insanely fast. Version 3 is significantly slower for me. Can anyone point me to any resources that might help me figure out the cause of the slowdown?
wiesson 2 days ago 2 replies      
Why is it so much faster than atom?
marcosscriven 2 days ago 1 reply      
Regarding the comparisons with Atom, can I take the opportunity to ask if anyone from ST could expand on their answer here[0] as how they go about making a custom OpenGL interface?

Do they use GLEW? Skia? How do they ensure anti-aliasing is done right?

[0] https://news.ycombinator.com/item?id=2822114

bsimpson 2 days ago 1 reply      
In case anyone was wondering, the new JavaScript syntax highlighter appears to understand ES6, but chokes on JSX. If you're writing React, the Babel package is still a good idea.
oneeyedpigeon 2 days ago 5 replies      
Is there a roadmap anywhere? I bought ST2, and I think I'm going to need to pay again for ST3 when I decide to upgrade, but it's still in beta and has been for over 3 years now. Would be nice to know when it will eventually be released.
nikolay 2 days ago 1 reply      
I wonder what kind of release process this guy has as the dev version always lags behind stable - https://www.sublimetext.com/3dev
wrcwill 2 days ago 1 reply      
Anyone know why this update changed highlighting for me?

Tomorrow Night Scheme, strings in double quotes are now gray instead of green?..

bobsoap 2 days ago 0 replies      
PSA: If you haven't updated yet, I'd wait. The new version apparently broke syntax highlighting for many languages. Check out their support forum[1] - it's causing ripples.

[1] https://forum.sublimetext.com/c/technical-support

m_mueller 2 days ago 3 replies      
Is it just me or did this break the python syntax highlighter for line continuations within strings? I had to install MagicPython to fix it.
thincan11 2 days ago 1 reply      
Is sublime working on something big?
Fizzadar 2 days ago 1 reply      
:( Python docstrings are no longer coloured as strings, but comments...
vyacheslavl 2 days ago 0 replies      
ST now supports async/await keywords in python! yay!
zyxley 2 days ago 5 replies      
It's good to see more ST updates, but it's sad that it feels like they're only even bothering because of the popularity of Atom.
wnevets 2 days ago 0 replies      
the 3.0 dev channel has been fairly active, 13 updates this year.
_RPM 2 days ago 3 replies      
The only proprietary software that I use as a matter of choice is Microsoft products and VMware products. For something as trivial as a text editor, I wouldn't dare use something closed-source and proprietary like this.