hacker news with inline top comments - 25 Mar 2016
1
Privacy Forget Your Credit Card privacy.com
256 points by doomrobo  3 hours ago   191 comments top 52
1
soneca 1 hour ago 8 replies      
Is that something that new? My bank (Itaú, in Brazil) has offered this option for some time now.

Here (in Portuguese): https://www.itau.com.br/cartoes/cartao-virtual/

Or am I missing something?

Edit: They launched it in 2002: http://exame2.com.br/mobile/tecnologia/noticias/itau-agora-t...

Edit2: Sounds new in the US. This is not supposed to be a bragging/snarky comment; I'm just genuinely surprised, as innovation usually comes the other way around, from the US to Brazil. So congrats on the launch! Good job - it sounds tough to launch this without being a bank!

2
ac29 1 hour ago 1 reply      
In case anyone didn't catch what this actually costs, the answer is: 1.5-2%, which is the rate you could get cash back (or airline miles/etc) with good credit.

Because this service draws directly from your bank account, and takes what would otherwise be your rewards from the credit card fees their banking partners charge, it provides a nice business model for them at the cost of you getting 0% rewards back. Not worth it, in my opinion.

3
boling11 3 hours ago 16 replies      
Hey HN - Privacy.com co-founder here. I'm really excited to share what we've been working on for the past year and a half or so.

We've been neck-deep in payments stuff on the card issuing side (getting a BIN sponsor, ACH origination, etc), so happy to answer any questions on that front as well.

P.S. For new users, your first $5 donation to watsi.org is on us :)

4
tedmiston 3 hours ago 2 replies      
My biggest question with Privacy, as with any one-time-use credit card number service, is always:

Will it affect my rewards? Will businesses still show up unaffected with the same categories on my credit card statement? (I have a travel rewards only card, so breaking the rewards flow is a deal-breaker for using a higher level service.)

Edit: I misunderstood the service as being able to be layered on top of normal credit cards. It looks like the funding source is only bank accounts for now. Still, my question remains whether building on credit or debit cards is on the roadmap.

Edit 2: They are one-time use numbers, right? "Use at merchants" (plural) seems to possibly imply otherwise.

> What happens when I generate a new Privacy card?

> We'll give you a random 16-digit Visa card number that you can use at merchants that accept Visa debit cards...

Edit 3: It sounds like the business model results in keeping the money that would go to rewards on a normal card.

> How do you make money?

> Every time you spend using a Privacy card, the merchant or website pays a fee (called interchange) to Visa and the issuing bank. This fee is shared with us. We have some premium features planned, but rest assured, our core virtual card product will always be free and we will never sell your personal data.

5
drglitch 1 hour ago 5 replies      
Both Citi and Bank of America (and, I believe, Wells Fargo, though I didn't personally use it) offered this service for free on their CC accounts in the mid-to-late 2000s.

You could set limits per number, have it lock to just a single merchant, etc. Pretty nifty when paying some wacky merchant online.

All have since shuttered the service, because pretty much every CC comes with purchase protection that you can invoke to charge back the vendor in case something goes wrong.

Virtual CCs provide very limited utility in my mind - because the places where you're likely to have your CC swiped - a bar or a cab - are still going to use only the legacy plastic version.

6
jjallen 1 hour ago 1 reply      
Wish they explained this better:

"Please ensure this information is accurate. We'rerequired to verify this information against publicrecords. But don't worry, we'll keep it private."

I suppose I'm legally opening a bank account, which requests similar info to this, but are they checking my credit (probably not, I know, but it makes me uncomfortable)? Will wait a while.

7
electic 1 hour ago 3 replies      
I signed up for this. Sadly, it is not what I thought it was and the website does not make it very clear. Basically, this is for online purchases only. To make matters a bit worse, it wants to connect to your real bank account.

What we need here is a physical credit card that I can use in the real-world that has a new number on each swipe. Most of my historical fraud has happened because I probably swiped my card at a location that was compromised.

Just my two cents.

8
habosa 2 hours ago 2 replies      
This is one of those things I have wanted to make so many times, and I assumed it would either be technically impossible (card numbers don't actually offer a huge number space) or it would just get flagged as fraud.

Excited to see someone giving it a try.

9
mirimir 3 hours ago 1 reply      
It's an interesting idea. However, I'm not comfortable with a third party having all that information. Some banks issue "corporate" cards, with numerous "employee" cards. I already trust the bank, after all. So what else does Privacy.com provide that's worth the risk? They're still subject to KYC, right? So there's no strong privacy. Or am I missing something?
10
nommm-nommm 2 hours ago 1 reply      
So what happens when I have to return something and they put the money back on the card I used to purchase it?
11
mkhalil 1 hour ago 1 reply      
I'm in love. Seriously, been waiting for this for soooo long. And the fact that the website supports two factor auth + is SUPER easy to use makes this a double whammy!!! :)

I've been a customer for about 5 minutes, have used it twice, and am already going to recommend it.

edit: I'm quite aware that this has been possible, but both banks/credit cards that I have make me jump through tons of ugly UI and clicks to make it happen.

12
nommm-nommm 2 hours ago 1 reply      
"Never forget the cancel one of those pesky 30 day free trials."

This is very misleading, to say the least. Not paying for a service doesn't cancel the service. If they tried to bill your card and the card was rejected, that doesn't mean the service is cancelled.

13
orf 2 hours ago 2 replies      
> Privacy is PCI-DSS compliant. We are held to the same rigorous security standards as your bank.

I always giggle when I see that.

14
cemregr 1 hour ago 1 reply      
The email you send to verify the bank comes off as SUPER shady. It reads exactly like a phishing email. It doesn't talk about which site / bank I'm using. Might be worth fixing.

From: Account Management Team <account.management@acctmanagement.com>

....

Thank you for being a valued customer.

Sincerely, Online Banking Team

15
__d 3 hours ago 5 replies      
I understand why you need it, and I want this service in a big way, but I'm just baulking at giving you my online banking username and password. Why should I trust you with that?
16
gwintrob 3 hours ago 0 replies      
Great company name. How'd you get the domain?
17
drglitch 1 hour ago 0 replies      
Quick question for the founder lurking here - if you're advertising yourself as a credit card and yet you do not extend credit (and use a bank account as the funding source), aren't you misadvertising? If it's just a virtual debit card, you are likely providing far less protection to the consumer than a credit card would.
18
mindslight 2 hours ago 1 reply      
I like this, especially the repudiating of the privacy-hostile billing name/address voodoo. But I'd worry about forgoing the traditional protection of credit card chargebacks, and having to rely on debit card terms and direct ACH.
19
fuzzywalrus 1 hour ago 0 replies      
I'm not sure if I'm ready to hand over personal details to Privacy, there's not much assurance other than "We'll never sell your data to anyone".

Does privacy.com see where I make all my purchases? Is there a collection of my metadata? What assurances do I have that you take personal privacy seriously?

20
elchief 1 hour ago 0 replies      
Looks cool.

Supports TOTP 2FA, HSTS, nosniff, CSP, x-frame-options, xss-protection

A+ ssllabs rating

A securityheaders rating
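For reference, a minimal sketch of what serving the headers listed above could look like; this uses Flask as a hypothetical stack, and the header values are illustrative, not Privacy.com's actual policy:

    # Minimal sketch: the response headers the parent comment lists,
    # shown on a hypothetical Flask app (values are illustrative).
    from flask import Flask

    app = Flask(__name__)

    @app.after_request
    def set_security_headers(resp):
        # HSTS: force HTTPS for a year (required for an A+ SSL Labs rating)
        resp.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
        # nosniff: stop browsers from MIME-type guessing
        resp.headers["X-Content-Type-Options"] = "nosniff"
        # CSP: only load resources from our own origin
        resp.headers["Content-Security-Policy"] = "default-src 'self'"
        # Clickjacking protection
        resp.headers["X-Frame-Options"] = "DENY"
        # Legacy XSS filter header
        resp.headers["X-XSS-Protection"] = "1; mode=block"
        return resp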

Some issues:

Some user enumeration issues. I emailed security@privacy.com but it doesn't exist...resent to questions@

I don't like how they ask for your bank's login username and password. I don't feel comfortable giving them that. There must be another way.

Should confirm email address before you can login

21
rgbrgb 47 minutes ago 0 replies      
Is this Final without a physical card?

https://getfinal.com/

22
iamleppert 51 minutes ago 0 replies      
It looks like funding is done via ACH. Does your business operate a credit operation as well, to handle the risk of money being spent when the ACH transaction can't be completed?

I've always wondered about the business side of that... where does the money come from, and how is individual debt handled? Do you operate collections? How do you do this without requiring a credit check? Etc.

23
cordite 3 hours ago 2 replies      
The stop subscriptions aspect really stood out to me, I had to spend 40 minutes on the phone with that darn company to get things canceled, even though I only used it for one day for an hour.
24
jcrawfordor 2 hours ago 1 reply      
I accept that disabling JavaScript is generally a losing battle, but it specifically irks me when the website of a privacy-centric service is just completely blank if you don't have JavaScript enabled. Of all 30 people out there browsing without JavaScript, it seems like they have an elevated chance of all wanting to learn about this service, and I find myself moderately discouraged from trying it by this issue.
25
pavs 1 hour ago 0 replies      
I use Neteller, which does something similar, called virtual cards. You can create multiple cards and assign funds to each virtual card. It's not as smoothly done as this one, but it's the same thing.
26
hdjeieejdj 11 minutes ago 0 replies      
the issues I have with this are:

1) only for online purchases and limited use case- how many times do I make a purchase online that's not on Amazon, or where I'm not using PayPal?

2) new chip cards already do this for in store purchases

3) loss of travel/reward points

27
r1ch 3 hours ago 1 reply      
Any way this works without a browser extension? I'm assuming such an extension has full access to every single page in order to do its job, which is a huge security risk. You don't need to be reading my emails or passwords.
28
avar 1 hour ago 1 reply      
I've been curious as to why the following strategy wouldn't work as a hack as well:

* Your credit card has a balance of $0 on it

* You have some app that allows $NAME to deduct $X from it

* You transfer $X to it earmarked for $NAME for some limited amount of time.

I.e. you could walk into Starbucks, have an app on your phone to say you're depositing $20 into an account earmarked for /starbucks/i for 30 minutes.
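A minimal sketch of that earmarking logic, assuming a toy in-memory ledger (the names, amounts, and API here are hypothetical, not a real payments system):

    import re
    import time

    class EarmarkedBalance:
        """Toy model: a $0-balance card plus time-limited, merchant-scoped deposits."""

        def __init__(self):
            self.earmarks = []  # list of (merchant_pattern, amount, expires_at)

        def deposit(self, pattern, amount, ttl_seconds):
            # e.g. deposit(r"starbucks", 20.00, 30 * 60) earmarks $20
            # for any merchant matching /starbucks/i for 30 minutes
            self.earmarks.append((re.compile(pattern, re.IGNORECASE),
                                  amount, time.time() + ttl_seconds))

        def authorize(self, merchant, amount):
            now = time.time()
            for i, (pat, avail, expires) in enumerate(self.earmarks):
                if expires > now and pat.search(merchant) and avail >= amount:
                    # Decrement the matching earmark and approve the charge
                    self.earmarks[i] = (pat, avail - amount, expires)
                    return True
            return False  # no matching, unexpired earmark: decline

    wallet = EarmarkedBalance()
    wallet.deposit(r"starbucks", 20.00, 30 * 60)
    print(wallet.authorize("STARBUCKS #1234 SEATTLE", 4.50))  # True
    print(wallet.authorize("SHELL OIL 57444", 4.50))          # False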

29
prohor 2 hours ago 1 reply      
Does it work if I live outside US?
30
mfkp 3 hours ago 2 replies      
Very useful - my citibank credit card used to have a feature like this many years ago (I believe called "virtual card numbers"), but they got rid of it for some reason.

Though I am more likely to give my personal details to citibank than some startup. Trust is a big issue with payment startups.

31
nikolay 1 hour ago 0 replies      
PayPal had this and killed it - stupid PayPal! Bank of America has this. Discover has this, too. CitiBank has it, too. I really hate not being able to get cash back with Privacy.com, so I probably won't use it.
32
DanBlake 2 hours ago 0 replies      
There are a few of these services and they all look awesome. The issue for me has always been that I value my points/miles more than I value the convenience of not worrying about my credit card # being stolen. If I could do this with my SPG card, I would be all over it.
33
justplay 1 hour ago 0 replies      
My bank also provides this type of virtual credit card, but it is useless. It doesn't work; I tried it on PayPal.
34
efader 1 hour ago 0 replies      
Oh the irony: a bank that offers burner-like credit card numbers and pretends not to know the aggregate transactions, all under the guise of privacy.

LOL

35
jdc0589 3 hours ago 0 replies      
Damn. I've been wanting a service like this for a very long time. Not just for privacy or security, but hopefully so that if my banking or real credit card information changes, I could just go to one place to make all my updates.

Looking forward to seeing how it looks.

36
eiopa 1 hour ago 0 replies      
ACH only :(

I want to use this, but I don't want to give you full access to my bank account.

37
leemailll 1 hour ago 0 replies      
Citi offers this feature, but not sure whether it is for all their credit cards
38
juli3n 1 hour ago 0 replies      
There is something named e-Carte in France, and it is directly powered by banks :)
39
pcarolan 1 hour ago 0 replies      
Good idea. Good marketing. Even if it's not new, this needs to happen.
40
tedmiston 2 hours ago 0 replies      
Any plans to make a physical card? Basically the multiple virtual card service you have now but in one card I can use in person, like Coin.
41
AznHisoka 2 hours ago 0 replies      
What payer name and address does the retailer see when the transaction goes through?
42
leonaves 3 hours ago 0 replies      
Love the idea, but I just wanted to shout out the logo. Best logo concept I've ever seen, and the whole branding looks great anyway. Brilliant work.
43
AJAlabs 1 hour ago 0 replies      
Some banks like Citibank do this as well.
44
chris_va 1 hour ago 0 replies      
How are disputes settled?
45
Swizec 1 hour ago 1 reply      
At first I was really really excited. This is something I've wanted for months if not years.

Then they asked for my bank username and password.

46
mtgx 3 hours ago 3 replies      
> STEP TWO: When you check out on any website, the Privacy icon will appear in the card form. Click it to create a new card, and auto-fill the card form. Use any name and billing address you like.

> STEP THREE: After the card is charged, we withdraw the money from your chosen funding account, similar to a debit card.

Not sure I get this. Do you have to fund an account on Privacy.com? So it's like a PayPal where you generate a new payer name every time you pay for some other service with it?

> Sensitive information is encrypted using a split-key encryption with partial keys held by separate employees, meaning no one can decrypt your data; not even us.

Umm. Pretty sure that giving your employees the ability to decrypt my data means that "you" can decrypt it.
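For reference, the simplest form of split-key storage is an XOR n-of-n secret split, where any n-1 shares reveal nothing about the key but all n together reconstruct it. A minimal sketch (an illustration only - the post doesn't say which scheme Privacy.com actually uses):

    import os

    def split_key(key: bytes, n: int = 2):
        # XOR n-of-n split: pick n-1 random shares, then compute the last
        # share so that XOR-ing all n shares yields the original key.
        shares = [os.urandom(len(key)) for _ in range(n - 1)]
        last = key
        for s in shares:
            last = bytes(a ^ b for a, b in zip(last, s))
        return shares + [last]

    def join_key(shares):
        key = bytes(len(shares[0]))  # all-zero bytes to start
        for s in shares:
            key = bytes(a ^ b for a, b in zip(key, s))
        return key

    key = os.urandom(32)
    shares = split_key(key, n=3)    # e.g. one share per employee
    assert join_key(shares) == key  # all shares together recover the key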

47
ginkgotree 3 hours ago 1 reply      
Hey! Such a great idea! Any chance you guys will work with Amex soon? I use my Platinum and Delta cards for everything.
48
strange_quark 3 hours ago 0 replies      
So I should give Privacy my bank account information in the name of "security"? No thanks.
49
kozikow 1 hour ago 0 replies      
Any plans to support UK cards?
50
subliminalpanda 3 hours ago 1 reply      
Are extensions for other browsers planned?
51
serge2k 3 hours ago 0 replies      
Finally, a card for my dial up needs!

Really though, isn't something like the Apple Pay system a better way? You don't risk getting flagged as a prepaid card and rejected, and you aren't giving out your data.

52
chris_wot 3 hours ago 1 reply      
Is this for only U.S. customers?
2
On the Impending Crypto Monoculture metzdowd.com
124 points by tonyg  3 hours ago   22 comments top 6
1
tptacek 48 minutes ago 1 reply      
This was an inevitable consequence of Bernstein being one of the very few cryptographers simultaneously devoted to:

* Theoretical rigor

* Competitive performance ("new speed record for X" being a theme of his research)

* Misuse-resistant constructions and interfaces

He was doing this stuff before it was cool (he's as far as I can tell the only cryptographer to have written something like qmail) and the rest of the industry is struggling to catch up.

The list of Bernsteinisms is slightly less scary in context:

* Curve25519 and EdDSA are sort of related work, and the alternative in non-DJB cryptography would be "NIST P-curve and ECDSA over that P-curve". The consensus seems to be that Edwards curves are superior for a bunch of practical reasons, and Bernstein pioneered them, so: no surprise.

* Poly1305 and ChaCha20 are virtually never used independently; they can be seen as DJB's most current AEAD construction (a short usage sketch follows at the end of this comment). There are plenty of competing AEADs; in fact, there's a competition (CAESAR) underway that DJB is involved in.

So really, that's not that much scarier than the fact that AES and SHA-3 share an author.
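Here is the promised sketch: a minimal example of the ChaCha20-Poly1305 AEAD in use, via the Python `cryptography` package (assuming it is installed; note the nonce must never repeat under the same key):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key = ChaCha20Poly1305.generate_key()   # 256-bit key
    aead = ChaCha20Poly1305(key)
    nonce = os.urandom(12)                  # 96-bit nonce, unique per message

    # Encryption binds the ciphertext to the associated data ("aad"):
    # tampering with either one makes decrypt() raise InvalidTag, which is
    # the authenticated-encryption property the comment is pointing at.
    ct = aead.encrypt(nonce, b"attack at dawn", b"header-v1")
    assert aead.decrypt(nonce, ct, b"header-v1") == b"attack at dawn"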

2
RcouF1uZ4gsC 1 hour ago 0 replies      
If we are talking about crypto monoculture, don't AES and SHA-3 come from Joan Daemen? Also, before this, 90's crypto was basically a Ron Rivest monoculture with RC4 and RSA. This is nothing new, and I believe today's monoculture is more secure than previous ones. Also, just like DES, RSA and RC4 got displaced, so will DJB's monoculture if something more secure comes along.

Basically, this monoculture is a consequence of the fact that crypto is very subtle, and it is often better to have 1 algorithm that everybody uses, implements, studies, and tries to break rather than 10 that nobody really studies.

3
tc 1 hour ago 3 replies      
Knew before clicking that this was going to be about DJB having won.

Peter Gutmann definitely has the credibility to make this critique. But saying that DJB having won is more a vote against other crypto than a vote for Dan is like saying that Git having won is more a vote against other SCMs than a vote for Linus.

Well sure, you could say that. But that would rather understate Linus' substantial contribution to thinking about version control differently and better.

Similarly DJB has won because he led the way in thinking about the crypto problem correctly. Peter basically acknowledges the underlying facts here, but seems to not want to give Dan his due.

4
oconnore 1 hour ago 1 reply      
I figured someone who knows more about this (than me) would write a good reply to this, and they did:

Ron Garret replies:

 Saying "How on earth did it come to this?" strongly implies that you think that the trend towards DJB's crypto suite is a problem, but you don't offer much in terms of proposals for how to solve it, or even what a solution would look like. You seem to agree that a solution would *not* look like the status quo. So what exactly are you advocating here? I submit that the impending monoculture in crypto is not necessarily a problem, any more than the monoculture in physics (what? No alternatives to GR and QM?) or climate science is necessarily a problem. It's possible that crypto has a Right Answer, and that Dan Bernstein has discovered/invented it. If you believe that simplicity and minimalism ought to be part of the quality metric, then there may be very few local maxima in the design space, and DJB may simply have found one of them. rg

5
devit 1 hour ago 3 replies      
One of the solutions is to start using algorithm cascades instead of single algorithms where performance doesn't matter.

If you are using 10 ciphers or 10 hash functions or 10 signature schemes, then you need 10 different breakthroughs before it all falls down.

There is really no reason to not do this unless performance is important, and a lot of times performance does not really matter.

NOTE: obviously you need to do this properly and use a different key for each cipher, concatenate hashes, concatenate signatures, and so on. Also, you should encrypt first with the best-implemented cipher, so that plaintext is not leaked if the worst ciphers happen to have timing/cache vulnerabilities.
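A minimal sketch of the hash half of this idea, concatenating independent digests so that a preimage attack has to beat every function at once (an illustration of the suggestion above, not a vetted construction):

    import hashlib

    def cascade_digest(data: bytes) -> bytes:
        # Concatenate digests from unrelated hash families: recovering a
        # preimage now requires breaking all of them simultaneously.
        return b"".join([
            hashlib.sha256(data).digest(),
            hashlib.sha3_256(data).digest(),
            hashlib.blake2b(data, digest_size=32).digest(),
        ])

    print(cascade_digest(b"hello").hex())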

6
zaroth 27 minutes ago 0 replies      
Doesn't X25519 suffer from the same nonce-reuse issue?
3
Valve releases Steam Controller CAD geometry so you can mod it pcgamer.com
72 points by Ivoah  3 hours ago   14 comments top 6
1
jamessteininger 1 hour ago 1 reply      
They did the same for the Steam VR motion controllers. We call them the doughnut sticks for fun at the office, but they are really incredible to use. It was a huge blessing for us that Valve included the exact model for us to modify and reskin, because having 1-1 mapping of what you're seeing in the simulation and what you're actually feeling is huge. We did the same with the PlaystationVR Move controllers, which you can see in an image here: https://1.bp.blogspot.com/-TNDizq2W4eY/VvMWu0mcx_I/AAAAAAAAL... This is just a simple material change for now, but we plan on doing design modifications to the mesh data eventually. Likely we would only change the geometry of areas the hands don't touch, like the tracking sphere, so as to keep the player visual/haptic continuity.
2
mc42 1 hour ago 1 reply      
This is actually a pretty big step forward, in my opinion. It makes it feel, for the first time (to me at least), like the company is responding to customer feedback. They know that people want to mod it, and rather than force them to use arcane methods to hack the current controller into their desired one, they can simply alter it digitally.

As an aside, I'd love to see a full-metal case for it, only using plastic where needed (the buttons and touchpads?)

3
kibwen 1 hour ago 1 reply      
For those who haven't gotten their hands on a Steam controller yet: I was very impressed. The software is still quite immature (par for the course for Valve), but the haptic feedback in the hardware is immensely cool and the potential configurability is insane. It really is a great middle ground between mouse+keyboard and traditional gamepads.
4
sixers2329 1 hour ago 0 replies      
This is actually really awesome - one of my biggest complaints with the Steam controller was how the build quality of it felt a little "cheap" (i.e. light & plastic-y when compared to an Xbox controller). The technology inside the controller is superb, and I think they had to sacrifice a little on build quality to bring it down to that ~$50 price range.

Very interested to see what kind of crazy improvements the community develops.

5
shmerl 1 hour ago 1 reply      
Interesting. What about firmware and the protocol? Will they open source / document it too?
6
santaclaus 2 hours ago 2 replies      
It would be pretty rad if you could get controller cases custom printed to fit your hands.
4
Citus Unforks from PostgreSQL, Goes Open Source citusdata.com
620 points by jamesheroku  9 hours ago   125 comments top 23
1
no1youknowz 9 hours ago 1 reply      
This is awesome. I have experience running a CitusDB cluster, and it pretty much solved a lot of the scaling problems I was having at the time. For it to go open source now is of huge benefit to the future projects I have.

> With the release of newly open sourced Citus v5.0, pg_shard's codebase has been merged into Citus...

This is fantastic, sounds like the setup process is much simpler.

I wonder if they have introduced the Active/Active Master solution they were working on? I know before, there is 1 Master and multiple Worker nodes. The solution before was to have a passive backup of the Master.

If, say, they released the Active/Active Master later this year, that's huge. I can pretty much think of my DB solution as done at this point.

2
devit 7 hours ago 2 replies      
I've been unable to find any clear description of the capabilities of Citus and competing solutions (postgres-x2 seems to be the other leader).

Which of these are supported:

1. Full PostgreSQL SQL language

2. All isolation levels including Serializable (in the sense that they actually provide the same guarantees as normal PostgreSQL)

3. Never losing any committed data on sub-majority failures (i.e. synchronous replication)

4. Ability to automatically distribute the data (i.e. sharding)

5. Ability to replicate the data instead or in addition to sharding

6. Transactionally-correct read scalability

7. Transactionally-correct write scalability where possible (i.e. multi-master replication)

8. Automatic configuration only requiring to specify some sort of "cluster identifier" the node belongs to

3
ahachete 1 hour ago 0 replies      
Congratulations, Citus.

Since I heard last year at PgConfSV that you would be releasing CitusDB 5.0 as open source, I've been waiting for this moment to come.

It augments 9.5's awesome capabilities with sharding and distributed queries. While this targets real-time analytics and OLAP scenarios, being an open source extension to 9.5 means that a whole lot of users will benefit from this, even under more OLTP-like scenarios.

Now that Citus is open source, ToroDB will add a new CitusDB backend soon, to scale-out the Citus way, rather than in a Mongo way :)

Keep up with the good work!

4
exhilaration 9 hours ago 3 replies      
5
gtrubetskoy 9 hours ago 4 replies      
If anyone from Citus is reading this: how does this affect your business model? I remember when I asked at Strata conf a couple of years ago why your stuff isn't open source, the answer then was "because revenue". So what has changed since then?
6
erikb 8 hours ago 0 replies      
Unforking is a very smart decision. Postgres has also gained a lot of favour since MySQL was bought by Oracle. Altogether, Citus has earned a lot of kudos for this move - at least with me, for whatever that may count!
7
TY 9 hours ago 2 replies      
This is awesome! Tebrikler (congrats) on the release of 5.0 and going open source - definitely great news.

Can you publish competitive positioning of Citus vs Actian Matrix (nee ParAccel) and Vertica? I'd love to compare them side by side - even if it's just from your point of view :-)

8
ioltas 22 minutes ago 0 replies      
Congrats to all for the release. That's a lot of work accomplished.
9
faizshah 8 hours ago 1 reply      
So this sounds similar to Pivotal's Greenplum which is also open source, can anyone compare the two?
10
azinman2 8 hours ago 2 replies      
I want it to be called citrus, which is what I always read it as....
11
voctor 6 hours ago 1 reply      
Citus can parallelize SQL queries across a cluster and across multiple CPU cores. How does it compare with the upcoming 9.6 version of PostgreSQL, which will support parallel sequential scans, parallel joins, and parallel aggregates?
12
rkrzr 9 hours ago 2 replies      
This is fantastic news! Postgres does not have a terribly strong High Availability story so far, and of course it also does not scale out horizontally. I have looked at CitusDB in the past, but was always put off by its closed-source nature. Opening it up seems like a great move for them and for all Postgres users. I can imagine that a very active open-source community will develop around it.
13
signalnine 7 hours ago 0 replies      
Congrats from Agari! We've been looking forward to this and continue to get a lot of value from both the product and the top-notch support.
14
BinaryIdiot 7 hours ago 0 replies      
I don't have a ton of experience scaling out and using different flavors of PostgreSQL but I had run across Postgres-XL not long ago; does anyone know how this compares to that?
15
jjawssd 9 hours ago 2 replies      
My guess is that Citus is making enough money from consulting that they don't need to keep this code closed source when they can profit from free community-driven growth while they are expanding their sales pipeline through consulting.
16
ccleve 6 hours ago 1 reply      
I'd very much like to see what algorithm these systems are using to enable transactions in a distributed environment. Are they just using straight two-phase commit, and letting the whole transaction fail if a single server goes down? Or are they getting fancy and doing some kind of replication with consensus?
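For illustration, a minimal sketch of the straight two-phase commit case described here, showing how one downed participant aborts the whole transaction (a toy model, not Citus's actual protocol):

    class Participant:
        def __init__(self, name, up=True):
            self.name, self.up = name, up

        def prepare(self, txn):
            # Phase 1: stage writes durably and vote; a downed node can't vote yes
            return self.up

        def commit(self, txn):
            pass  # Phase 2: make the staged writes visible

        def abort(self, txn):
            pass  # Phase 2: discard the staged writes

    def two_phase_commit(txn, participants):
        if all(p.prepare(txn) for p in participants):
            for p in participants:
                p.commit(txn)
            return "committed"
        for p in participants:
            p.abort(txn)
        return "aborted"

    shards = [Participant("shard-1"), Participant("shard-2", up=False)]
    print(two_phase_commit("txn-42", shards))  # "aborted" - one down node fails all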
17
ismail 5 hours ago 0 replies      
Any thoughts on using something like Postgres+Citus vs. Hadoop+HBase+ecosystem vs. Druid for OLAP/analytics with very large volumes of data?
18
satygeek 4 hours ago 1 reply      
Does CitusDB fit OLAP analytical workloads that do aggregations on hundreds of millions of records, using dimensions of varying order and size (e.g. Druid), with a max response time of 3 seconds and using as few boxes as possible? Or do other techniques have to be used along with CitusDB? Can you shed light on your experience with CloudFlare in terms of cluster size and query performance?
19
X86BSD 8 hours ago 2 replies      
AGPL? This is dead in the water :( It will never be integrated into PG. What a shame. It should have been a 2-clause BSDL. Sigh.
20
Someone 4 hours ago 2 replies      
One must thank them for open sourcing this, and cannot blame them for using a different license, but using a different license makes me think calling this "unfork" is bending the truth a little bit.
21
Dowwie 7 hours ago 0 replies      
Would a natural evolutionary path for startups be to emerge with PostgreSQL and grow into requiring CitusDB?
22
onRoadAgain23 8 hours ago 5 replies      
Having been burned before, I will never use an OSS infrastructure project that has enterprise features you need to pay for. They always try to move you to paid, and make the OSS version unpleasant to use over time, as soon as the bean counters take over to milk you.

"For customers with large production deployments, we also offer an enterprise edition that comes with additional functionality"

23
Dowwie 7 hours ago 0 replies      
Is it correct to compare CitusDB with PipelineDB?
5
Docker for Mac and Windows Beta docker.com
767 points by ah3rz  11 hours ago   213 comments top 55
1
falcolas 9 hours ago 9 replies      
The last time I used xhyve, it kernel-panicked my Mac. Researching this on the xhyve GitHub account [1] showed it was determined to be due to a bug involving VirtualBox. That is, if you've started a virtual machine with VirtualBox since your last reboot, subsequent starts of xhyve panic.

So, buyer beware, especially if said buyer also uses tools like Vagrant.

[1] https://github.com/mist64/xhyve/issues/5

I've said before that I think the Docker devs have been iterating too fast, favoring features over stability. This development doesn't ease my mind on that point.

EDIT: I'd appreciate feedback on downvotes. Has the issue been addressed, but not reflected in the tickets? Has Docker made changes to xhyve to address the kernel panics?

2
tzaman 11 hours ago 7 replies      
If I had a yearly quota on HN for upvotes, I'd use all of them on this.

> Volume mounting for your code and data: volume data access works correctly, including file change notifications (on Mac inotify now works seamlessly inside containers for volume mounted directories). This enables edit/test cycles for in container development.

This (filesystem notifications) was one of the major drawbacks for using Docker on Mac for development and a long time prayer to development god before sleep. I managed to get it working with Dinghy (https://github.com/codekitchen/dinghy) but it still felt like a hack.

3
izik_e 3 hours ago 0 replies      
We have been working with hypervisor.framework for more than 6 months now, since it came out, to develop our native virtualization for OS X: http://www.veertu.com As a result, we are able to distribute Veertu through the App Store. It's the engine for fast virtualization on OS X. And we see now that Docker is using it for containers. We wish that Apple would speed up the process of adding new APIs to hypervisor.framework to support things like bridged networking and USB, so everything can be done in a sandboxed fashion without having to develop kernel drivers. I am sure the Docker folks have built their kernel drivers on top of the xhyve framework.
4
wslh 10 hours ago 3 replies      
Can someone explain in simple terms how Docker for Windows is different from Application Virtualization products like VMware ThinApp, Microsoft App-V, Spoon, Cameyo, etc.? Also, why does it require Hyper-V activated in Windows 10? I found this: https://docs.docker.com/machine/overview/ but I don't understand whether you need separate VMs for separate configurations or whether they have a containerization technology where you are able to run isolated applications on the same computer.
5
darren0 10 hours ago 3 replies      
This is an amazing announcement, but... The beta requires a NDA. The source code is also not available. This gives the impression that this will be a closed commercial product and that really takes the wind out of my sails.
6
_query 11 hours ago 4 replies      
If you're using docker on mac, you're probably not using it there for easy scaling (which was the reason docker was created back then), but for the "it just works" feeling when using your development environment. But docker introduces far too much incidental complexity compared to simply using a good package manager. A good package manager can deliver the same "it just works" feeling of docker while being far more lightweight.

I wrote a blog post about this topic a few months ago; check it out if you're interested in a simpler way of building development environments: https://www.mpscholten.de/docker/2016/01/27/you-are-most-lik...

7
rogeryu 11 hours ago 3 replies      
> Faster and more reliable: no more VirtualBox!

I'm a Docker n00b, still don't know what it can do exactly. Can Docker replace Virtualbox? I guess only for Linux apps, and suppose it won't provide a GUI, won't run Windows to use Photoshop?!

8
rocky1138 11 hours ago 1 reply      
"the simplest way to use Docker on your laptop"

I think they forgot about Linux :)

9
mwcampbell 8 hours ago 1 reply      
Interesting to see that at least one of the Mirage unikernel hackers (avsm) has been working on this.

https://news.ycombinator.com/item?id=11352594

I imagine a lot of this work will also be useful for developers wanting to test all sorts of unikernels on their Mac and Windows machines.

10
philip1209 6 hours ago 1 reply      
Does anybody have any guides on setting up dev environments for code within Docker? I recall a Dockercon talk last year from Lyft about spinning up microservices locally using Docker.

We're using Vagrant for development environments, and as the number of microservices grows - the feasibility of running the production stack locally decreases. I'd be interested in learning how to spin up five to ten docker services locally on OSX for service-oriented architecture.

This product from Docker has strong potential.

11
nzoschke 11 hours ago 1 reply      
Very excited about this. Docker Machine and VirtualBox can be a rough experience.

> Many of the OS-level integration innovations will be open sourced to the Docker community when these products are made generally available later this year.

Does this mean it is closed right now?

12
mathewpeterson 5 hours ago 0 replies      
I'm really excited to see this because I've spent the last few months experimenting with Docker to see if it's a viable alternative to Vagrant.

I work for a web agency and currently, our engineers use customized Vagrant boxes for each of the projects that they work on. But that workflow doesn't scale and it's difficult to maintain a base box and all of the per project derivatives. This is why Docker seems like a no-brainer for us.

However, it became very clear that we would have to implement our own tooling to make a similar environment. Things like resolving friendly domain names (project-foo.local or project-bar.local) and adding in a reverse proxy to have multiple projects use port 80.

Docker for Mac looks like it will solve at least the DNS issue.

Can't wait to try it out.

edit: words

13
Lambent_Cactus 8 hours ago 9 replies      
Tried to sign up, but the enroll form at https://beta.docker.com/form is blank for me - it just says "Great! We just need a little more info:" but has no forms.
14
totallymike 10 hours ago 1 reply      
I'm delighted to read that inotify will work with this. How's fs performance? Running elasticsearch or just about any compile process in a docker-machine-based container is fairly painful.
15
alexc05 6 hours ago 0 replies      
I cannot wait to get home to play with this!

If I were a 12 year old girl I would be "squee-ing" right now. Ok, I'm lying - I'm a 40 year old man actively Squee-ing over this.

:)

It really plays nicely into my "weekend-project" plans to write a fully containerized architecture based in dotnet-core.

16
f4stjack 10 hours ago 2 replies      
So, let's say I am developing a Java EE app under Windows with Eclipse and want to use a Docker container for my app. How do I go about it?
17
raesene4 10 hours ago 1 reply      
This is v.cool, although for the Windows version it'd be great if it became possible to swap out the virtualization back-end so it's not tied to Hyper-V.

At the moment VMWare Workstation users will be a bit left out as Windows doesn't like having two hypervisors installed on the same system...

18
danbee 3 hours ago 1 reply      
I couldn't sign up using Firefox on Windows. I'd enter a username, email and password then the form would just go blank on submission.
19
AsyncAwait 4 hours ago 1 reply      
Why does signing up for the beta require agreeing to a non-disclosure agreement?
20
numbsafari 10 hours ago 0 replies      
I'm really hoping that this will be available via homebrew and not a way to force everyone to use Docker Toolbox or, god forbid, the Mac App Store.

Docker Toolbox just brings back too many nightmares from Adobe's awful Updater apps.

21
nstart 11 hours ago 1 reply      
My goodness. This is some of the best news from docker this year and we are still just getting started. Packaging various hot reloading JavaScript apps will finally be possible. Gosh. I can't begin to say just how excited I am for this.
22
_mikz 11 hours ago 2 replies      
23
sz4kerto 11 hours ago 1 reply      
Can some Docker employee explain how file permissions are going to work on Windows? For me, that's the biggest pain (on Win).
24
evacchi 8 hours ago 1 reply      
I wonder if (and hope that!) this fixes the issues[1] with (open)VPN. I can't use xhyve (or veertu) at work because of this.

[1] https://github.com/mist64/xhyve/issues/84

25
jtreminio 9 hours ago 0 replies      
I run my stack(s) on Vagrant with Puppet for provisioning. I use OSX, but one of the major pain points of working with Linux VMs on a Windows host is file permission issues and case insensitivity.

I don't think Docker can do anything about case sensitivity, but with this new release will permissions differences be handled better?

26
alfonsodev 11 hours ago 2 replies      
Biggest problem with Boot2docker was volume mounting and file permissions; hope this happens soon.

> Volume mounting for your code and data: volume data access works correctly, including file change notifications (on Mac inotify now works seamlessly inside containers for volume mounted directories). This enables edit/test cycles for in container development
27
Grue3 5 hours ago 0 replies      
I really want to try this, but I'm unable to register. At the page where it says "Create your free Docker ID to get started", after I click Sign Up the page just refreshes, and my chosen ID becomes blank with no indication of what's wrong. I've tried several different IDs and none of them worked. Browser is Firefox 45.0.1 on Windows 7.
28
jnardiello 9 hours ago 1 reply      
To be entirely honest, I'm quite concerned about your choice of Alpine as the base distro. Their choice of musl over glibc might be cool, but if you have to put old libs inside a container, it's hell (if not entirely incompatible).
29
ruipgil 11 hours ago 0 replies      
Finally, I really hated the additional complexity and gotchas that boot2docker carried.
30
slantedview 7 hours ago 1 reply      
I've been running docker-machine with a VMWare Fusion VM with VT-x/EPT enabled and am using KVM inside my containers to dev/test cloud software. I'd be interested to know if I can still get the performance of Fusion and the support I need for nested virtualization out of Docker for Mac.
31
contingencies 2 hours ago 0 replies      
Non-news (support for two new hypervisors implemented, already-dodgy package altered) voted up to 718 points. God, you people are sheep. I guess what we take from this is that Docker is getting desperate for newslines.
32
nikolay 7 hours ago 0 replies      
I've always wondered about invites for open-source projects... that don't even open-source...
33
mrfusion 4 hours ago 0 replies      
Would this be a good way to deploy a program based on OpenCV to nontechnical users? So far I haven't found a good way to do that.
34
girkyturkey 5 hours ago 0 replies      
Finally! I've spent the last month or so on Docker to learn about it as I am somewhat new in this environment. I'm just excited to try it out and have a more broad range of tools.
35
pokstad 8 hours ago 0 replies      
Funny this appears today, I just discovered Veertu on the Mac App Store (http://veertu.com) 2 days ago and love it. It also uses OS X's new-ish hypervisor.framework feature to allow virtualization without kernel extensions or intrusive installs.
36
bradhe 9 hours ago 0 replies      
This is amazingly cool. We've been using docker at Reflect (shameless: https://reflect.io) since we started it and even if we didn't have all the cgroups features, it'd be super helpful just to be able to run the stack on my laptop directly instead of having the Vagrant indirection.
37
silvamerica 7 hours ago 1 reply      
Will there be an easy way to switch / upgrade from docker-machine with vbox without having to recreate all of my images and containers over again?

I know it's a small thing, but it's kind of a pain sometimes.

38
awinter-py 5 hours ago 0 replies      
great news but I'm not sure a young startup should be wasting money on what was obviously a professionally produced launch video
39
geerlingguy 11 hours ago 1 reply      
Private beta is behind a questionnaire, just FYI. You can't, unfortunately, download anything yet unless you get an invite.
40
paukiatwee 10 hours ago 1 reply      
If I read correctly, Docker for Mac runs on top of another virtualization layer (xhyve, not VirtualBox) and Docker for Windows runs on top of Hyper-V, which means it is not for production workloads (at least on Windows).

So you can only use it for development. And it is closed source. Hmmm...

41
mateuszf 9 hours ago 2 replies      
When I log in and go to https://beta.docker.com/form there is an empty form, and the JS console says: Uncaught ReferenceError: MktoForms2 is not defined
42
rikkus 7 hours ago 0 replies      
So on Windows this runs Linux in their isolated environment? I just got excited thinking it meant Windows in Windows but it looks like that's not the case.
43
ThinkBeat 5 hours ago 1 reply      
I would like to see Windows Docker images. Will this ever happen? Or can I do it already?
44
partiallypro 6 hours ago 0 replies      
Kinda surprised they didn't just wait 7 days and announce this at Build with Microsoft.
45
eggie5 9 hours ago 0 replies      
Using Docker on a Mac always seemed too hackish b/c you had to run a separate VM. This seems like a step in the right direction, and I'm excited to visit Docker again!
46
Titanous 10 hours ago 1 reply      
Is the source code available? I don't see it at https://github.com/docker
47
d_sc 9 hours ago 0 replies      
This is great news to hear, I've been using a brew recipe that includes: brew install xhyve docker docker-compose docker-machine docker-machine-driver-xhyve to get close to what they're doing in this beta. Really looking forward to trying this out. Signed up for the beta!
48
brightball 10 hours ago 0 replies      
This is HUGE! Looking forward to trying it out.
49
tiernano 11 hours ago 0 replies      
The link says it's Hyper-V on Windows, but then says Windows 10 only... Anyone know if Windows Server is also supported?
50
ndboost 10 hours ago 0 replies      
shut up and take my money!
51
TheAppGuy 9 hours ago 0 replies      
Is this relevant to my app developer community on Slack?
52
eddd 10 hours ago 1 reply      
i'll finally get rid of docker-machine, THANK YOU DOCKER.
53
howfun 11 hours ago 1 reply      
Why would Windows Pro be required?
54
serge2k 7 hours ago 0 replies      
still just VMs?
55
pmoriarty 11 hours ago 1 reply      
Unfortunately, despite the title, Docker still does not run natively on a Mac or on Windows. It runs only inside a Linux VM.

From the OP:

"The Docker engine is running in an Alpine Linux distribution on top of an xhyve Virtual Machine on Mac OS X or on a Hyper-V VM on Windows"

6
Minimal cell raises stakes in race to harness synthetic life nature.com
101 points by superfx  5 hours ago   48 comments top 6
1
pak 1 hour ago 1 reply      
This article is self-contradictory.

In the second caption: "Each cell of JCVI-syn3.0 contains just 473 genes, fewer than any other independent organism."

In the main text: "In a 1995 Science paper, Venter's team sequenced the genome of Mycoplasma genitalium, a sexually transmitted microbe with the smallest genome of any known free-living organism, and mapped its 470 genes."

Which is it? Venter would seem to have disproven his own claim to novelty, unless we've found new genes in M. genitalium's genome since 1995.

Edit: As with most biological terms, part of the problem is the fuzziness of the definition of "gene". More recent studies claim M. genitalium has 525 genes [1], but that might be including tRNA and ncRNA regions. I'd still object to the article's poor editing. Also, let's get down to brass tacks here: we've only trimmed a 580kb genome down to 531kb (about an 8% reduction). Clearly, life is already pretty damn efficient.

[1] http://www.cell.com/cell/fulltext/S0092-8674(12)00776-3

2
dghughes 4 hours ago 7 replies      
I wonder, would it be possible to accelerate a synthetic cell's evolution via software?

If you already know exactly what's in the cell (you built it!) could a computer speed up its life via a software model and jump ahead to something more complex which you build as version 2?

3
Terr_ 2 hours ago 0 replies      
Often this topic makes me think of some of the fictional technology present in Deus Ex, released back in 2000, e.g.:

> The cells of every major tissue in the body of a nano-augmented agent are host to nanite-capsid "hybrids." These hybrids replicate in two stages: the viral stage, in which the host cell produces capsid proteins and packages them into hollowed viral particles, and the nanotech stage, in which the receiver-transmitter and CPU are duplicated and inserted into the protective viral coating. New RNA sequences are transmitted by microwave and translated in to plasmid vectors, resulting in a wholly natural and organic process.

4
nsxwolf 5 hours ago 3 replies      
Minimal cell, or minimal genome? I don't think we're anywhere near capable of creating a synthetic cell, minimal or not.
5
horsecaptin 1 hour ago 0 replies      
Hopefully they're not creating something airborne with an unending thirst for life.
6
justsaysmthng 2 hours ago 2 replies      
> Church says that genome-editing techniques will remain the go-to choice for most applications ...

George Church is kind of a sarcastic name for a genetic scientist...

But the ethics question is more open than ever. Didn't you immediately think about how cool it would be if we could program these organisms to do stuff? I did. Could you then program it to become multicellular?

How long before this can be achieved? 10 years? 50?

However, "Because its there" is a worrying motivation to pursue this knowledge, because the pandora's box it opens is very real.

We're getting closer to the point where we can play God and achieve magical technological feats.

But are we mature enough to handle the powers that this technology bestows upon us? Do we really need this technology now?

7
On Let's Plays That Dragon, Cancer thatdragoncancer.com
60 points by sp332  3 hours ago   16 comments top 6
1
Joof 1 hour ago 1 reply      
How do you combat this? Realistically speaking.

Here's how: ask major Let's Play channels to explain that the game is meant to be experienced firsthand, and that viewers should come watch after they've played. Not for money, but for the experience.

Toby Fox did this with Undertale on Game Grumps, and it was noted that there was a significant lag before a number of viewers picked the game up, which indicates that people actually followed this advice. They trust the Let's Players' opinion.

In reality it probably still helps sales. I routinely buy games that are high up on twitch that I hadn't heard of before.

2
TillE 2 hours ago 1 reply      
I think it's extremely unlikely that this has a significant impact on sales.

But I understand why some developers get upset about it, in the same way Jonathan Blow was upset about piracy despite massive sales. It's a loss of control. You made a thing and you don't want people just taking it.

Emotionally, that makes perfect sense. But empirically, these factors have never been make-or-break for any game. Heavily pirated games are also huge sellers. Firewatch is doing very well right now despite being a similarly YouTube-able game.

3
tomc1985 1 hour ago 3 replies      
Hasn't anyone seen how kids obsess over Let's Play?

I've talked to several kids who almost prefer watching Let's Plays over playing any given game. That South Park episode was spot on.

Developers should be alarmed!

4
slavik81 1 hour ago 0 replies      
For many games, a recording of someone playing it is so different from the actual game itself that I strongly support giving rights to streamers. However, their point is well-made. This game is more similar to a movie than it is to, say, Minecraft. It's reasonable to reflect that in how recordings are treated.

Their game is not particularly interesting to me, but it seems they made something unique. I hope it eventually pays off for them.

5
striking 20 minutes ago 0 replies      
Chances are that the people who are just watching the LP version of your game won't buy it.

Sorry, but it's true.

6
vacri 44 minutes ago 0 replies      
Well, when you make a product that's more like a movie than a game (two-hour linear story where you experience everything by watching a video), then the public will treat it more like a movie than a game. It's even priced like a movie - $15 for 2 hours of content. The problem isn't the Let's Play world; the problem is that the product is an edge case. The article sort of skims right by this issue while recognising it: "And for games with more expansive or replayable gameplay, it can directly benefit developers." There's your problem right there - you 'made a movie, not a game', and released it through gaming channels.

Another point is that there are literally tens of thousands of games out there, and people use Let's Play not just for entertainment, but to see if a game is any good before purchasing. Essentially they use it for demo-ing the product, and if your game and your 'demo' are essentially the same thing, then that's a bad business decision, regardless of how much soul you poured into the product. Harsh, but if you're complaining from a business point of view, then you have to deal with harsh realities.

For my own experience, I saw it crop up, and thought "Why would I want to spend $15 for a short game about cancer?". I was curious about it, but not so curious as to drop $15... or even search out a Let's Play myself. I imagine most of the Let's Play audience were similar - suggesting that $1/viewer is somewhat realistic recovery for lost sales... that's nonsense, in my opinion.

Finally, a bonus problem: When your game is two hours long and you distribute through Steam, which has a no-questions-asked refund policy for 2 hours or less of gameplay, then a lot of those casual viewers you're after will just take that money back.

8
Justin Time ycombinator.com
117 points by melvinmt  5 hours ago   42 comments top 14
1
paul 4 hours ago 4 replies      
Justin doesn't just work with the media. Justin IS the media.

I think it's worth pointing out that Justin.tv led to the two largest YC exits to date, Twitch (Justin.tv pivot), and Cruise (founded by Justin.tv cofounder Kyle Vogt and Justin's brother Dan).

In 2008, Justin.tv co-founder Michael Seibel found Brian Chesky crashed on the floor of a hotel in Austin, offered him space in his room, began coaching him on how to build a startup, introduced him to the rest of the Justin.tv team, and ultimately brought them into YC! (and look where he is now: https://www.youtube.com/watch?v=hP6TH3pBPi8)

I look forward to more of the same :)

2
sahara 4 hours ago 2 replies      
Congratulations to Justin, and to Sam/YC for making a great choice.

On a tangential but related note, does anyone recommend any good follows on Snapchat, particularly anyone discussing tech/entrepreneurship? (There are obviously tons of entertaining celebrities, athletes, musicians, etc, but that's not really relevant here.)

Mark Suster (msuster on snap) of Upfront Ventures/bothsidesofthetable is good for at least a few 'snapstorms' a week covering a wide variety of VC topics. I've also really been enjoying the stories from Bobby Kim (bobbyhundreds), co-founder of seminal LA streetwear label The Hundreds, which tend to be a mix of standard day-in-the-life Snapchat fare as well as more introspective reflections on life and business. Bobby's a smart guy with diverse interests, even if you don't care about skateboarding or which overpriced, limited-edition collaboration is responsible for today's line out the door somewhere on Fairfax, it's still worth checking out Bobby's snaps.

3
jypepin 5 hours ago 0 replies      
The only reason I open Snapchat daily is to watch Justin's stories. Nothing else. He's entertaining, with a good mix of fun stuff and great life advice, and he answers interesting startup questions.
4
jamesblonde 40 minutes ago 0 replies      
Is nobody else surprised that a techie (Sam Altman) can use the term PR without qualifying that he doesn't mean GitHub? In Swedish it's called 'yrkesskadad' - occupational damage.
5
jedberg 4 hours ago 0 replies      
Hah! Just yesterday I was talking to someone about what an amazing stage presence Justin has. Couldn't have picked a better person.
6
iLoch 2 hours ago 0 replies      
I've been following Justin for a while on Snapchat. He's always doing something interesting and offering helpful advice, as well as just providing some insight into the culture of YC. I think he's a great choice for managing PR at YC, seeing as he's already been doing that unofficially for a while now.
7
devy 3 hours ago 0 replies      
Congrats to Justin! Here is a clip where he talks about himself and his startup experience.

https://www.youtube.com/watch?v=CGvS3NvuLWU

8
OoTheNigerian 4 hours ago 1 reply      
Seconded.

I've interacted personally with him (he may not remember) and I would say he is quite thoughtful and effortlessly able to empathize and interact excellently.

My buddy in this YC batch has nothing but good words to say.

The YC network is a defensible asset I believe PG never saw coming.

Congratulations Justin!

9
m0th87 5 hours ago 0 replies      
I was a $BIGCO intern in 2011. I only hung out with Justin briefly at that time, but he more than anyone else got me into the startup scene. He came off to me as both rawly honest and friendly - attributes that are usually mutually exclusive. He'll make an awesome spokesman.
10
aleyan 5 hours ago 1 reply      
Congratulations to Justin Kan.

It is interesting that Snapchat is developing in a direction where people promote it for communicating with them in a business context. When calling on people to follow someone on Snapchat, you should include their username or QR code, though.

11
johnlbevan2 3 hours ago 0 replies      
I heard Justin Time was lazy... though he is always there when needed.
12
mkoble11 4 hours ago 0 replies      
have been following him on snapchat - this makes total sense. another great move, guys.
13
p4wnc6 1 hour ago 0 replies      
This must clearly have come on the heels of YC being turned down by Brandon Content.
14
ladon86 5 hours ago 2 replies      
> Follow him on Snapchat

What's his username?

9
Show HN: Teleport SSH for Clusters and Teams gravitational.com
104 points by twakefield  4 hours ago   29 comments top 11
1
gfloyd 4 hours ago 1 reply      
This looks like a really cool project. I'm excited to see it develop.

How would authentication work with configuration management? I see that new nodes are authenticated with a one-time token generated from the auth server, but that seems like it could be tricky to implement in a dynamic cluster (like an AWS auto scaling group).

2
old-gregg 4 hours ago 3 replies      
Hello everyone, the Teleport team is here to answer any questions.

Internally we use Teleport as a library to connect multiple clusters into a structured system of doing ops with solid identity management, but we figured it deserves to be its own tool, especially because so many larger companies in the Valley have built something similar internally.

4
pritambaral 4 hours ago 1 reply      
This solves a problem we were looking at in my last job: recording and replaying sessions. Identity management and bastion setup are solved problems, but it is nice to have an all-in-one package.

Of course, this raises a few security questions:

1. Do I have to run this as a server on every host I intend to ssh into? Or can it use existing installations of openssh for that?

2. Is this re-inventing any authentication mechanism? If yes, how robust is it and how thoroughly has it been tested? (I'm guessing not much right now, since this isn't production ready yet, but the question will remain for a while.)

3. Do I have to use a different client? Or are existing ssh clients fully sufficient? The article does mention compatibility with OpenSSH, but does not go into detail. It also mentions using HTTPS as a transport instead of SSH, which is concerning in the case of compatibility.

5
s0l1dsnak3123 4 hours ago 1 reply      
This looks great. How feasible would it be to have server and tag data synchronized between Teleport and AWS?
6
Shamiq 4 hours ago 0 replies      
Awesome project! This is a step in the right direction for better access management.
7
pmuk 3 hours ago 1 reply      
Is this compatible with deployment automation systems like Ansible? Do you have any plans to write an Ansible Galaxy role for installation?
8
felipebrnd 3 hours ago 1 reply      
Looks like an amazing tool to have.

With it, would one be able to connect only through the web console? (couldn't find it in the docs)

9
microcolonel 48 minutes ago 0 replies      
One thing of note: though they do say it's "fully compatible with OpenSSH", it is not compatible with ed25519 keyed SSH, as Golang SSH does not support it.

Otherwise looks like a cleverly designed system. Being able to use a standard terminal emulator to connect would be nice though.

10
ukd1 3 hours ago 2 replies      
Looks awesome! does it support mosh by any chance?
11
peterwwillis 3 hours ago 1 reply      
So, they implemented Active Directory/RADIUS, a terminal proxy, screen, and a web gui? Does this not seem to anyone else like a weird mix of features for one tool?
10
American Big Brother: A Century of Political Surveillance and Repression cato.org
61 points by kushti  4 hours ago   6 comments top 3
1
yuhong 58 minutes ago 0 replies      
I think part of the problem in the US is the culture where, e.g., police departments compete for resources, basically to make their own department bigger. One of the ideas in the Ron Paul movement was that government needs to be smaller. Of course, in the real world it probably should not go as far as Ron Paul suggests, but....
2
cubano 2 hours ago 0 replies      
Mencken once said that many Americans, due to their Puritan roots, live in "The haunting fear that someone, somewhere, may be happy."

This surveillance and its informants, such as are discussed in this article, seem to me to be the embodiment of this fear.

Many people, for whatever reasons I will never be able to fathom, seem to get some sort of thrill from snitching on the activities of others who, I guess they feel, are enjoying themselves too much.

Maybe it's envy... maybe it's some authoritarian streak... I don't know, but it sure exists, and many of my old friends have been bitten by it.

3
tedks 1 hour ago 3 replies      
Trust a Cato article/infographic/whatever to totally ignore the red scare and rightist motivation for virtually all political surveillance and repression.
11
Google Nik collection now available for free google.com
75 points by Numberwang  4 hours ago   6 comments top 4
1
barney54 2 hours ago 1 reply      
This is a real bummer because it implies they are done developing it.

I bought the Nik Collection a couple years ago and I have gotten good value out of the product. I hoped they would continue to improve it.

If you like photography you should definitely download it.

2
adrianmacneil 27 minutes ago 0 replies      
Awesome! I really like their HDR tool, although it was pretty much the only one I used.

For anyone interested in the history, this product was the result of an acquisition by Google (I think they wanted the team for their Google+ Photos product). While they don't seem interested in developing it further, it's nice that they continue to make the software available.

[0] https://en.wikipedia.org/wiki/Nik_Software
[1] http://bits.blogs.nytimes.com/2012/09/17/google-buys-nik-to-...

3
r1ch 35 minutes ago 0 replies      
Never heard of these. How do they compare to Lightroom's built in features?
4
zeveb 1 hour ago 1 reply      
Only supports Macs & Windows, sadly.
12
The Immobile Masses: Why Traffic Is Awful and Public Transit Is Worse vice.com
125 points by sageabilly  7 hours ago   262 comments top 24
1
soyiuz 5 hours ago 15 replies      
Two footnotes on this article:

1. The problems it highlights can be reduced to low tax revenues. Our infrastructure is crumbling because we do not collect enough taxes to subsidize it. Things like roads and trains cannot (and should not) pay for themselves---they are a public good. The train might be empty at night, but the ability to take a train home prevents drunk driving, for example. One cannot put a monetary value on services like that; they speak to our collective quality of life. The SF transit situation is the direct consequence of a failing tax base. The wealth of the local tech industry is not "trickling down" to improve city infrastructure, in proportion to the industry's growth.

2. Re: the conversation about walk-ability of cities. The key concept here is density. We need to value density as it allows for more compact living. Instead, municipalities in places like the Bay Area consistently vote against new construction and against zoning laws that would allow for taller, more densely populated buildings/neighborhoods. The law of supply and demand says increase the supply of housing to make something affordable. This is not some mysterious process: there's simply no political will on the part of existing inhabitants to "devalue" their residences by increasing the supply in the housing market.

2
massysett 6 hours ago 3 replies      
This is not at all illuminating and is just a typical advocacy piece for more transit funding.

Like most of these pieces, it compares road spending and transit spending as though this is somehow a useful comparison. It isn't. For rail transit systems, spending includes all labor and capital expenses--train operators, cars, electricity, etc. Road spending does not include the enormous capital investment that citizens and businesses spend for motor vehicles.

Putting that problem aside for a moment, it says that transit gets a smaller share of the funding pie. So what? Roads blanket the nation--I don't think this article would suggest operating public transportation to compete with every road.

Then there's the usual "OMG induced demand": "road building as already mentioned does nothing to combat traffic" because of induced demand. This is specious. Yes building roads encourages people to go places. That's the point.

To top it off the piece says nothing about "why traffic is awful".

3
dkopi 6 hours ago 5 replies      
The best method of transit is walking. A lot of the problems with traffic and public transport are solved when we invest in walk-able cities. Cities where you can live close enough to work to walk there, close enough to your friends, close enough to the grocery store, your neighborhood bar or your kid's school.

No discussion of mass transit or "giving up your car" is complete without discussing the walk-ability of cities.

4
rsync 5 hours ago 7 replies      
Can we just come out and admit something ?

Buses are terrible. They are terrible functionally, they are terrible aesthetically, and they are terrible logistically.

In fact, I would go so far as to say that across a broad spectrum of preferences, you would be very hard pressed to find anything worse in the urban, built environment than some big, loud, lumbering, clumsy (and usually) sooty bus bungling about the place.

I love good public transit. I love light rail. I love the subway. I will do anything not to ride a bus. I am reminded of that quote from steve jobs about the touchscreen phones and the stylus:

"if you see a (bus), they blew it."

5
verg 5 hours ago 2 replies      
Costs are a major part of the problem. US rail construction costs are by far the most expensive in the world [1]. Other countries are able to build rail at costs in the $100-250 million per km range (even in dense cities). The East Side access project in NYC has costs around $4 billion per km. Los Angeles has much better costs in the $400-500 million per km range [2]. It's hard to imagine the US will be able to build much transit at those costs.

From 2012 [3]: "When asked by transit blogger Benjamin Kabak about its high construction costs, Michael Horodniceanu, president of the New York City Metropolitan Transportation Authority's capital construction division, gave a two-word answer: work rules. Citing the example of the city's revered sandhogs, he said the MTA employs 25 for tunnel-boring machine work that Spain does with nine."

[1] https://pedestrianobservations.wordpress.com/2011/05/16/us-r...
[2] https://pedestrianobservations.wordpress.com/category/transp...
[3] http://www.bloombergview.com/articles/2012-08-27/labor-rules...

6
BurningFrog 5 hours ago 1 reply      
Traffic is awful because road owners aren't charging for access.

From an Economics standpoint, congested traffic is the same phenomenon as the old Soviet bread lines. An underpriced good is inaccessible in practice, since supply is way lower than demand at that price.

The solution is "Road Pricing", where drivers pay to drive. The price varies depending on what road, time of day etc. Maximum revenue should coincide with maximum throughput, giving everyone (ready to pay) a smooth and fast commute. It also provides incentives to build more roads where they are mostly needed.

7
FreedomToCreate 6 hours ago 2 replies      
Cities need to prioritize walking, biking and transit. What this means is that, these modes of transportation need to be made safer and faster. Currently walking through a major city is bogged down by the number of intersections you have to wait at. One idea would be to make intersection movement faster for pedestrians and transit during all hours except the morning and evening rush hour, during which vehicle movement should be prioritized to get cars off the roads as quickly as possible.
8
Tiktaalik 5 hours ago 1 reply      
The declining gas tax problem is going to get worse as electric cars increase in popularity. The solution is comprehensive road pricing, where a larger share of the real costs of road infrastructure and parking infrastructure are borne by the users.

The added benefit of correctly pricing driving is that people will make more informed decisions about where they live and how they get to work, that will result in more compact communities and less urban sprawl.

9
louprado 5 hours ago 0 replies      
To expand upon the specific discussion of the OP, I feel we are witnessing a historical pattern: urban-flight -> under-valued urban real-estate -> then urban renewal and economic opportunity -> influx of homeless and criminals since high-population density is good for both -> then public criticism that the cops are too heavy handed -> cops less likely to enforce + strain on infrastructure and services (like mass transit) + urban unrest due to economic disparity -> urban-flight -> ...

If the problem is that the BART is operating beyond capacity, adding capacity might not matter if the population is set to decline for the other reasons stated.

10
supergeek133 5 hours ago 2 replies      
I live in Minneapolis, we have the light rail in the downtown area that runs all the way down to the Mall of America and the airport. It also runs to St. Paul. We subsidize it heavily, but it is also an honor system for paying for it (no turnstiles). We also have a pretty decent bus system.

Problem is the further away you get from the city center the worse it gets. They've talked about putting in a light rail line to the southwest suburbs at a cost of billions of dollars, meanwhile the core roads don't get additional help and are perpetually bad because of the winter.

It's a balancing act, and at some point everyone needs to decide whose lifestyle is more important from a priority perspective IMO. If I decide to live in the suburbs, and commute an hour a day, is a dollar more important for that person? Or the person closer to the city core that wants more mass transit options?

Honestly I can go both directions. I currently have a 10 minute commute (by car). But I've also had the 60+ minute commutes for jobs. I also use the light rail to get to the airport pretty frequently. However I never used mass transit to get to work because I like to be able to leave when I want, and go anywhere I want as needed from work.

11
greggman 5 hours ago 3 replies      
I'd really like to know what the true costs are.

Are Hong Kong, Seoul, Tokyo, Singapore all massively subsidizing their mass transit? Do they have enough riders that they're profitable? Are they more or less efficient in how they manage them?

How about Amsterdam, Copenhagen, Stockholm, Antwerp, Koln, Barcelona, etc... which are all an order of magnitude smaller than the previously mentioned cities but all have pretty good public transportation.

These last cities are all on the same order of size as SF. About 1 million people each and yet they have vastly better public transportation than SF

12
CalRobert 5 hours ago 1 reply      
One of the biggest issues is that we force businesses and housing to have ridiculous amounts of free parking, which means that instead of dense areas where buildings can be next to each other we have a sea of asphalt with a building sprinkled here and there. In central areas these minimums amount to parking welfare for suburbanites.

People say to me "but it takes 90 minutes to go 15 miles on transit!!" - the problem isn't that transit should be faster, it's that you shouldn't have to go through 15 miles of primarily asphalt hellscape to get to basic amenities!

The core of Dublin is less than two miles across. I used to live in the middle of it and never missed a car. The core of San Diego.. well it's not really a core, and even then it's routine to have to travel several miles for basic errands, even if you live fairly close to downtown. A downtown that is prevented from growing by parking minimums. Of course, if you ask a potential employer about transit access they look at you like you're from Mars (and, of course, choose to move on to a less hippie-ish applicant)

13
trhway 5 hours ago 1 reply      
Besides everything else, I can't get a dog or cat on public transit here. Thus I have to have a car. In Russia, when I wasn't able to afford a car, my cat rode bus, train, subway with me when we had to get him somewhere. Of course, I'd get a car there too the moment I could afford it, yet public transit was a feasible alternative when I didn't have a car.
14
pklausler 5 hours ago 2 replies      
Nearly perfect for me are (1) a city with great light rail, and (2) a Brompton folding bicycle for the first & last miles. I realize that this combination isn't available to all, but it's awesome.
15
mrfijal 6 hours ago 1 reply      
The article is written in absolutes (talking about public transit and traffic in general) but happily ignores the fact that there is a world outside of America. Maybe looking at places that are better commute-wise than SF despite being a lot poorer would be a start?
16
thatfrenchguy 5 hours ago 2 replies      
Let's not forget that BART also made bad engineering choices, like non-standard tracks and trains that probably cost a lot of taxpayer money...
17
merraksh 5 hours ago 0 replies      
The money that funds mass transit [...] comes from a mix of four sources: [...] On the federal side, most of that money comes from the federal gas tax: 18.4 cents on a gallon of regular gas, 24.3 cents on the gallon for diesel, [...] 19 percent going to mass transit. That's right -- mass transit depends on people driving cars for a significant portion of its federal funding.

I don't find this counterintuitive: the more people use their car, the more the mass transit system is strengthened and more capable to ease car traffic. Maybe it's far fetched, but it's like tax on cigarettes to finance lung cancer research.

18
Spooky23 2 hours ago 0 replies      
Public transit sucks because the quality of governance has corkscrewed down as the media gets weaker and dumber. This is literally the best time in history to be an inept and/or corrupt politician -- nobody is watching.

Roads suck because capital is easy to come by, so real estate squatters control central business districts. It's cheaper/easier/more convenient to build commercial space in the burbs.

19
blizkreeg 3 hours ago 0 replies      
I read someplace that a large part of the budget that is allocated to local government agencies in SF goes to pay pensions and very generous benefits of its employees. How true is that?
20
sna1l 4 hours ago 0 replies      
The fact that we don't have a couple of operators instead of a driver for each individual train on BART is beyond me. Look at all the private companies (Magic Bus, etc.) that are sprouting up because of how terrible our public transportation is.

BART essentially holds a monopoly on our transport. I believe their contract states even during a strike, new drivers have to be trained for 6 months before they are allowed to drive BART trains. There is absolutely no way that it takes 6 months to learn how to drive an AUTOMATED train. There should be public bids submitted from private companies to run on these railways, which would lead to better service and lower costs.

21
pdq 6 hours ago 4 replies      
The solution in the future will be driverless cars, shared driverless vans, etc. These should improve overall commute times, since there will be fewer accidents and denser transportation. Also people will be able to spend their commute time reading books/news, doing work, watching videos, or other recreation, rather than driving the car.
22
cowardlydragon 2 hours ago 0 replies      
Power assisted bicycles that take minimum effort to go 20mph would be amazing
23
marknutter 3 hours ago 1 reply      
We created this wonderful thing called the Internet that allows us to work and collaborate with each other from anywhere in the world, yet we all still cling to the silly idea that we need to continue to expand and repair our physical transit infrastructure so we can all travel two ways every day to sit next to somebody in some office to... stare at a computer connected to the Internet for 8 hours. It becomes even more absurd in areas like San Francisco where there literally isn't even enough housing to fit everyone.

We could solve our transit infrastructure woes overnight with policy. Give a tax break to companies who have remote workers. Either that or charge people to use public transportation infrastructure on a supply/demand basis. If more people use a freeway to get to work, the cost to use it goes up, and if you don't use it at all, you don't pay a dime. This would force companies who require their workforce to be physically present to pay higher salaries to cover the cost of commuting, which may cause them to re-evaluate remote work.

Commuting to and from work really only makes sense if you are interacting with things you can't take home with you.

And before you jump in and start spreading FUD about remote work, consider this: if it suddenly became illegal to require employees who could do their work remotely to come into a physical office every day, would businesses simply shut down? Or would they figure out a way to make it work? I'm guessing they would figure out a way to make it work.

24
stevewilhelm 4 hours ago 0 replies      
> In fact, planners and economists call road building induced demand because it encourages people to hop into their cars instead of walking or taking mass transit.

In the Bay Area, building transit infrastructure is expensive and time consuming.

Case in point: a five-mile extension of an existing BART line, including one new train station, cost $890 million and took seven years to build. [1] The resulting extension is expected to increase ridership by 5,000 to 7,000 daily trips in the next decade. [2]

There are currently 400,000 daily automobile trips from East Bay to and from Santa Clara County. [2]

[1] http://www.bart.gov/about/projects/wsx
[2] http://www.bart.gov/about/projects/wsx/chronology

13
$50B transit proposal would boost light rail throughout Seattle region seattletimes.com
13 points by jseliger  1 hour ago   7 comments top 2
1
LAMike 23 minutes ago 2 replies      
$50 billion could buy a lot of self-driving cars in 5 years... why not just spend it on city-owned cars that taxpayers can order on demand?
2
serge2k 1 hour ago 1 reply      
is this system going to be anywhere close to the actual needs of the region in 25 years?

Seriously, 25 years out. That just seems ridiculous to me.

14
Stencila Spreadsheet-like live reactive programming environment stenci.la
137 points by anu_gupta  7 hours ago   83 comments top 24
1
nokome 5 hours ago 2 replies      
Hey, Stencila developer here. Thanks to the original poster for sharing the link and for all the interest. Unfortunately, the site is not handling all that interest too well (R session hosting instances filling up, timeouts,...). I'm working on it but please bear with me while I try to stabilize things.
2
AndyMcConachie 6 hours ago 5 replies      
Lotus 1-2-3 got people to use computers. One could argue that spreadsheets are the single biggest business innovation of the late 20th century. I don't think they're going away that easily.

I know people who type whole letters in Excel. I know an accountant in particular who I showed how to use MS Word for writing letters, but he prefers Excel. He writes full documents in Excel, prints them out and mails them.

Laugh if you want but spreadsheets are not going away in my lifetime.

3
gavinpc 41 minutes ago 0 replies      
Spreadsheets somehow hit a vein with the public, so that tells you something. I've been kind of obsessed with the Tup build system, and in trying to distill what it does into one sentence, I find it helpful to say, "It's like a spreadsheet for files, where the formulas are shell commands." Even though I can grok FRP on its own, I still find that analogy helpful.
4
mattbowen 6 hours ago 2 replies      
I've been shifting a bunch of data analysis out of spreadsheets (and some adhoc SQL) to Jupyter/Pandas, and we've found some unexpected tradeoffs on both sides.

The lack of testability and version control (and really, long-term maintainability) is what drove us out of spreadsheets into Jupyter. We've found though that the workflow in Jupyter, even with Pandas, is not great for exploring and getting a feel for data --- we end up missing the very quick "poking around" you can do in excel or in a good sql client.

I'd have to use Stencila more to know if it strikes the right balance for the kind of analysis work I do, but I'm glad to see such a thoughtful attempt to try a new balance.

5
askyourmother 6 hours ago 0 replies      
Spreadsheets can be a pain, but be careful what you ask for to replace them...

Having contracted at two big American banks (cough jp, cough boa) that decided to build their own proprietary technology sinking ship solutions (cough athena, quartz), I can say it is a horrible experience.

The business get sold on the idea of the "benefits" of this new "solution" so they pay lots for it. Lots and lots. Then all tech projects are guided towards it, otherwise it becomes a "political" problem if you don't use it, even if you can present a technical case.

Basically, it works, just about, if you are building the equivalent of a simple option spreadsheet pricer. Even then, the effort required should ring alarm bells. Still, job for life for those that maintain it...

6
giardini 5 hours ago 3 replies      
I think the more usual form of that phrase is "Spreadsheets are dead, long live spreadsheets!". And that captures the truth of the situation better.

Despite all the known weaknesses of spreadsheets, it is sheer hubris to believe that they will be supplanted by reactive programming (or databases, or anything else for that matter). It seems that users will give up their spreadsheets only when they are pried from their cold, dead hands.

European Spreadsheet Risks Interest Group has some great links to articles about the pros/cons/whattodo of spreadsheets: http://www.eusprig.org/

Jocelyn Ireson-Paine has done lots of work with spreadsheets and their problems. Lots of links to spreadsheet sites from there: http://www.j-paine.org/

7
dnprock 3 hours ago 1 reply      
I think notebook environments like iPython provide greater flexibility in terms of programming. I don't think they can replace spreadsheets, nor will spreadsheets develop functionality to replace notebook environments.

The missing link is a seamless experience between spreadsheet and reproducible data analysis (programming).

My team is working on https://nxsheet.com. We look to provide this seamless experience. For example: Generate normal distribution - https://nxsheet.com/sheets/56e845da4030182e337c6c2b

Stencila looks interesting. Great work!

8
shaftway 6 hours ago 3 replies      
Iteration on spreadsheets is great, and there's opportunity for tons of innovation in this space. But this suffers from the same problems that most spreadsheets have. Errors like not averaging over the right set of data (which you pointed out) don't come up because the code can't be diffed in git. They come up because there's not enough clarity in whether a set of cells are consistent or not. Making this text-based doesn't add clarity because ultimately there will be too much text to read.

Here's a test for the most common kind of error. If I add data in A5 and B5, will that data be represented in the chart? How about in the averages? Will I be able to even see that?

Here's the pivot: Break your data into regions, where regions have repeated elements. Something like this:

  A1 = 'Height
  B1 = 'Width
  C1 = 'Area
  A2:C2 = Region { C1 = =A1*B1 }
                 { A1 = 2  B1 = 3
                   A2 = 4  B2 = 5 }
  B3 = 'Average
  C3 = =sum(C2)/count(C2)
This could be used to generate:

  Height   Width   Area
  2        3       6
  4        5       20
  Average          13
The dataset associated with that sub-block can be clearly annotated to an editing user as such, with simple tools for adding a row to that dataset, or sorting it without affecting the rest of the sheet. Within the dataset, there's no corruption of the formulas (the third row's C can't be different from the second's), you've still got your diffing (probably better, because it's clear whether data changed or formulas did), and it's extremely hard to make the calculated average come out wrong.

Yeah, that exact representation isn't great. Maybe a "view" metaphor, so that headers and footers can be attached to the dataset instead of floating outside it. But once you've gone this direction, there's all sorts of amazing things possible by sharing datasets, linking them, transforming them, etc.

9
migueldeicaza 5 hours ago 1 reply      
I am personally a fan of Calca; not only does it do live coding, it can also do some powerful math:

http://calca.io/

10
michaelwww 4 hours ago 1 reply      
This article reminded me of an interesting video by Chris Granger, co-author of the Light Table code editor, titled "In Search of Tomorrow: What does programming look like in 10 years?" He's designing a visual/code block hybrid editor.

https://www.youtube.com/watch?v=VZQoAKJPbh8

11
steveeq1 4 hours ago 1 reply      
Has anyone here ever used Lotus Improv? Does this program solve some of the flaws that this article highlights? I keep on hearing how Improv was one of the great programs that never took off, and I'm curious to now try it.

For those of you that don't know (or weren't even born yet), Lotus Improv was a spreadsheet alternative released for the NeXT computer in the early '90s that was well-reviewed but never sold well. It was eventually abandoned when IBM bought Lotus in the mid-'90s.

12
jmj42 4 hours ago 1 reply      
It seems, perhaps, Resolver One's time has finally come. Alas, Resolver One (and Resolver Systems) died a slow death and ceased to be in 2012. Perhaps this will fare better.

I always thought Resolver was a brilliant idea, but they never gained any traction against the heavy weights (Excel).

Edit - Add link to Resolver One Wikipedia article: https://en.wikipedia.org/wiki/Resolver_One

13
kornish 3 hours ago 0 replies      
One project I've recently been wanting to play around with is Beaker, which touts itself as a polyglot data science notebook tool. The pitch is that you can use the right tool for the job for whatever questions you want to ask, side by side. It looks like an iPython notebook on steroids.

http://beakernotebook.com/
https://github.com/twosigma/beaker-notebook

Disclaimer (or not): no involvement, but looks very handy.

14
digi_owl 4 hours ago 2 replies      
I think another change is needed for spreadsheets to be a more reliable tool.

Right now a sheet is a full document thing. Meaning that it starts with A1 in top left, and spreads out from there.

But what if you could break a "sheet" down into units?

So that you can have say a constants unit, with its own A1 to whatever as needed, and then another unit, perhaps called results, that hold the formula in their own A1 to whatever, each unit movable and extendable on the screen at the same time.

Now if formula A1 wants to refer to something in constants A1, the entry would look something like constants(A1).

This way one would have everything on screen, while avoiding the worry that as the sheet grows and one move things around to make it more readable, the formulas are referencing the wrong cells.

15
debacle 4 hours ago 1 reply      
This looks very sexy, and I don't say that often about technology. This is something I've wanted to see for a long time - the ability to seamlessly move from a spreadsheet to an application.

Edit: Quick feedback

Cell names appear to be case sensitive. While this makes sense from a programming standpoint, if I can't have a cell named "a1" the application should convert that to "A1" for me.

Not being able to tab between cells is annoying. Was there a decision made not to capture the tab event?

16
akhilcacharya 5 hours ago 1 reply      
This looks very similar to AlphaSheets and the thing I was making over spring break until my friends called it stupid.

Very interesting!

17
samfisher83 5 hours ago 1 reply      
Can you make this in your spreadsheet software:

http://www.quertime.com/article/arn-2012-08-22-1-25-drawings...

18
ebiester 6 hours ago 0 replies      
This kind of makes me want to brush off POI and attach it to Scala to develop a git-able way to create spreadsheets.

Create your spreadsheet, import it to an IDE, and create a way to intelligently create reports that can be connected to business intelligence suites... Someone must have done this.

19
meesterdude 4 hours ago 0 replies      
very well done! a plaintext spreadsheet. I think it has a lot of appeal, and for all the reasons outlined. Looking at it raw, is fairly understandable. Awesome! Coolest thing i've seen all month on HN.
20
robbiemitchell 4 hours ago 1 reply      
Do you think named cells and ranges in Excel / Google Sheets gets at this problem?
21
zellyn 5 hours ago 1 reply      
Misspelling Dan Bricklin's name within the first four words gives me low expectations
22
luso_brazilian 6 hours ago 2 replies      
OT and just my opinion: there seems to exist an unfortunate trend in the promotion of new ideas where, instead of simply exposing the virtues of the idea on its own, it first needs to knock down the (proven and world-adopted) predecessor as something unpolished, unplanned, untested, and, in general, a bad idea from the start.

Compare this trend, for instance, with the first Linux announcement by Linus Torvalds [1] below.

It is an unfortunate trend because it ends up polarizing and creating unnecessary division among the early adopters of the new idea and the current userbase of the old one.

From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)

Newsgroups: comp.os.minix

Subject: What would you like to see most in minix?

Summary: small poll for my new operating system

Message-ID: <1991Aug25.205708.9541@klaava.Helsinki.FI>

Date: 25 Aug 91 20:57:08 GMT

Organization: University of Helsinki

Hello everybody out there using minix

I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I'd like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things).

I've currently ported bash(1.08) and gcc(1.40), and things seem to work. This implies that I'll get something practical within a few months, and I'd like to know what features most people would want. Any suggestions are welcome, but I won't promise I'll implement them :-)

Linus (torvalds@kruuna.helsinki.fi)

PS. Yes it's free of any minix code, and it has a multi-threaded fs. It is NOT protable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that's all I have :-(.

[1] http://www.thelinuxdaily.com/2010/04/the-first-linux-announc...

23
dang 4 hours ago 0 replies      
Since that title is baity, we changed it to something more neutral in accordance with the HN guidelines:

https://news.ycombinator.com/newsguidelines.html

If anybody suggests a better title, we can change it again.

24
wrong_variable 6 hours ago 5 replies      
Spreadsheets are optimized for data entry. Try entering CSV data manually using a text editor.

Without Excel there would be no way for my sales/marketing team to give me valuable data to process at the backend.

If you have been doing programming in excel for the past 20 years then you are simply a fool - sorry.

15
Micropackages and Open Source Trust Scaling pocoo.org
252 points by s4chin  11 hours ago   77 comments top 25
1
ergothus 8 hours ago 6 replies      
I think this is a serious set of reasonable thoughts about the incident, and don't want to demean the article.

That said, I wish more people would talk about both sides. Yes, every dependency has a cost. BUT the alternatives aren't cost-free either. For all the ranting against micropackages, I'm not seeing a good pro/con discussion.

I think there are several lessons to be learned here (nixing "unpublish" is a good one, and I've not been impressed with the reaction from npm there), the most important of which is probably that we should change our build process: Dev should be pulling in updates freely to maintain the easy apply-fixes-often environment that has clearly been popular, then those should be pinned when they go past dev (to ensure later stages are consistent) and we should have some means of locally saving the dependencies to reduce our build-time dependency on package repos.
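
npm already ships a primitive for the pin-past-dev step: running npm shrinkwrap at promotion time writes the exact resolved dependency tree to npm-shrinkwrap.json, which npm install then honors in later stages. A hypothetical excerpt of the generated file (package name and versions are illustrative, not a real lockfile):

  {
    "name": "my-app",
    "version": "1.0.0",
    "dependencies": {
      "left-pad": {
        "version": "1.0.2",
        "from": "left-pad@^1.0.0",
        "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.0.2.tgz"
      }
    }
  }

Combined with a local registry mirror or a checked-in cache, that also covers the "locally saving the dependencies" half.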

Sadly, though, I've not seen a lot of discussion on a reasonable way to apply those lessons. I've seen a lot of smugness ("Any engineer that accepts random dependencies should be fired on the spot", to paraphrase an HN comment), a lot of mockery ("haha, look at how terrible JS is!"), and a lot of rants against npm as a private entity that can clearly make mistakes, but not much in the way of constructive reflection.

Clearly JS and NPM have done a lot RIGHT, judging by success and programmer satisfaction. How do we keep that right and fix the wrong?

2
scrollaway 9 hours ago 4 replies      
> Sentry depends on it 20 times. 14 times it's a pin for 0.0.1, once it's a pin for ^1.0.0 and 5 times for ~1.0.0.

This is what I was mentioning in the other thread (and being called a troll for... sigh). I appreciate the idealism of "if we have micromodules, we don't have to reimplement common helper functions, which scales to thousands of bytes saved!". But in practice, there's craptons of duplicate dependencies with different versions. Which negatively scales to hundreds of kilobytes wasted. In code, in downloads, in install time, in developer time (because devs install things too. A lot more than end users in fact...), etc.

One of the many reasons that what's on paper doesn't correspond at all to what we actually get.

3
l1ambda 9 hours ago 5 replies      
The problem with standard libraries is they are a standard library. A place where good code goes to die. Standard libraries also mean you can't use the particular version of the module you need; now you are pinned to the version of the standard library that comes with the version of the language you are running on. The workaround there is to fork out the standard library code into...a module. Now, a lot of these modules are designed for old JS runtimes like old versions of IE, so you wouldn't have a standard library anyway.

There's plenty of good libraries like lodash and math.js that are pretty much the next best thing to a standard library.

If your dependency tree sucks, that's a personal problem. It's not npm, JavaScript or node's fault. That's like blaming git because you pushed some crappy code.

The problem was fixed 10 minutes later anyway. This whole discussion surrounding this is a combination of knee-jerk reaction, "waah", and realization of "Oh shit, depending on external code means we are dependent on external code!"

If you want to code without dependencies, go write JavaEE. Everything is included, you don't need any third party dependencies, and you can use cutting-edge tech like JSP, app servers and JavaServer Faces.

4
pjc50 9 hours ago 3 replies      
Since at least the 70s people have been trying to "componentise" software in the same way that electronics is componentised: rather than assembling something out of a pile of transistors, build integrated circuits instead. The intent is to reduce cost, complexity and risk.

This has never yet quite worked out in software. Object-orientation was part of the resulting research effort, as are UNIX pipelines, COM components, microkernels and microservices. When it goes wrong you get "DLL Hell" or the "FactoryFactoryFactory" pattern.

It looks like the javascript world has forgotten about integration and instead decided to do the equivalent of assembling everything out of discrete transistors every time. The assembly process is automated, so it appears costless - until something goes wrong.

But really this is the fault of the closed source browser manufacturers, who prefer to attempt lockin over and over again through incompatible features rather than converge on common improvements.

5
grandalf 6 hours ago 1 reply      
To quote an old adage, package size doesn't matter.

The actual issue has to do with trusting a package of any size over time. This is true regardless of whether the package implements 1 line of code or 1000.

The trustworthiness of a package is a function of several factors. Code that is not actively maintained can often become less trustworthy over time.

What we need is one or more 3rd party trust metrics, and our bundling/packaging utilities should allow us to use that third party data to determine what is right for our build.

Maybe some of us want strong crypto, maybe others of us want adherance to semver, maybe others want to upgrade only after a new version has had 10K downloads, maybe others only want to use packages with a composite "score" over 80.

On the continuum of code quality from late night hack to NASA, we all must draw a line in the sand that is right for a particular project. One size does not fit all.

It's a big mistake (as well as a profound example of bad reasoning) to blame micropackages. The size of the package has nothing to do with it. Any codebase with any number of dependencies faces some risk by trusting the maintainers or hosting of those dependencies to third parties, which is the problem we need to do a better job of solving.

6
tlrobinson 9 hours ago 0 replies      
> My opinion quickly went from "Oh that's funny" to "This concerns me".

This was my response as well:

> The combination of a micro-library culture, semver auto-updates, and a mutable package manager repository is pretty terrifying.

https://mobile.twitter.com/tlrobinson/status/712442098381754...

Either of the second two properties are dangerous on their own, but culture of micro-libraries compounds the problem.

7
coenhyde 7 hours ago 0 replies      
Everyone is blowing the "micropackages are the problem" completely out of proportion. The real problem with the left-pad fiasco is that someone was able to revoke a package other people depended on. Packages should be immutable.
8
askyourmother 9 hours ago 0 replies      
I heard one of the JS "devs" refer to npm as nano package management. It sounded more like an abdication of one's duty as a developer to understand what you are adding as a dependency, why, and the long-term cost.

How many developers here would gladly add a rogue "dependency", like a developer they had never spoken to before, into their project without some care? And yet the willingness to open the front and literal back doors of the project to so many dependencies, like low-quality functions-as-a-module is astounding.

9
seibelj 9 hours ago 0 replies      
The large number of standard packages is clearly a benefit for Python. Node should start a vetting process for inclusion in a standard package system, start moving in key libs, and then host official docs.

I guarantee that the weaknesses of the NPM ecosystem are already known and exploited by bad actors. There are people who earn large 6 figure salaries / consulting fees for finding and exploiting these issues. This is a wakeup call that we need to do something about it.

10
zanny 5 hours ago 0 replies      
A lot of the value in these remarks is that they are coming from the author of Flask, the most popular microframework for Python, which itself has a massive extension tree that also suffers a lot of the same problems as NPM - trying to update or maintain a Flask project often involves navigating a tremendous amount of dependency hell on all kinds of modules, from flask-sqlalchemy to flask-wtforms to flask-bootstrap to flask-oauth, etc. The worst part is tons of these modules and extensions are dead projects that rot for years, but when you implement everything independently in its own git tree it gets many fewer eyes upon it, as Armin mentions in the OP regarding one-liner Node packages.

But it does not spiral out of control nearly as badly as any attempt at frameworks on NPM, because unlike Node almost every Flask extension depends on three things - Flask, the external package the extension attaches to (e.g. WTForms), and Python's standard library.

A similar Node package would depend on possibly hundreds of tiny one liners to replace the absence of standard library.

Which gets to the heart of the problem, right? The reason I've never even considered Node is because Javascript is like PHP - a mutant language born of need rather than intent, that kind of just grew over time to fill use cases constrained by its unique position in the ecosystem rather than as what someone considered the "best answer to the job". Python (3) is almost entirely the antithesis of that. Writing Python is a joy because it is designed from the ground up to be a paradigm to solve problems, not a problem that breeds a paradigm.

There is no way for Node to fix this as long as it tries to be browser compatible. We will never see ECMAScript standards adopt an ISOC++ stance of maturing the language with a comprehensive standard library to meet the needs of the language in the day and age it is being used, because there are very disparate interests involved in Javascripts language design going forward. That is its blessing and curse - Javascript will never grow into a Java-scale monstrosity of standard library bloat because a tremendous number of people involved in Javascript also have to implement Javascript and thus don't want a larger surface area of work to do. But it is important to remember that Javascript was never meant to be anything. It was made for dynamic HTML pages in Netscape. The fact that two decades later it is being shoehorned into web server dev and desktop applications should be scary.

11
jonstokes 8 hours ago 2 replies      
Help me understand why these micropackages exist in a world where tree shaking is a thing? Why is there no stdlib that rolls up all of the commonly used small dependencies? (I'm kind of a n00b to JS, so it's a non-rhetorical question.)
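
For what it's worth, tree shaking only helps once code is written as ES2015 modules that a bundler can analyze statically. A minimal sketch, assuming a hypothetical stdlib.js that gathers small helpers as named exports (padLeft and the file name are illustrative):

  // app.js -- only the exports actually imported survive bundling.
  import { padLeft } from './stdlib.js';

  console.log(padLeft('5', 3, '0')); // "005"

A bundler like Rollup would then drop padLeft's unused siblings from the output, so a rolled-up stdlib need not bloat shipped bundles the way an old-style monolithic library would.
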
12
raesene4 8 hours ago 0 replies      
Good article even though I don't agree with all the conclusions.

I find a good way to think about things is that every single dependency you have adds another set of people you have to trust.

You're trusting the competence of the developer (i.e. that the library has no security flaws), you're trusting their intent (i.e. that they don't deliberately put malicious code into the library) and you're trusting their Operational Security practices (i.e. that their systems don't get compromised, leading to loss of control of their libraries).

Now when you think about how little you know about most of the owners of libraries you use, you can see possibility for concern.

The bit I disagree with the article about is signing. I personally think that developer signing is a useful part of this as it takes the repository owner out of the trust picture (if done correctly). Without it you're also trusting the three items above for the repository provider and it's worth noting that a large software repo. is a very tempting target for quite a few well funded attackers.

Docker at least has provided some of the technical pieces to address this in their repositories with content trust...

13
drinchev 8 hours ago 0 replies      
I think the problem is in npm and not in the micro-modules.

Writing isomorphic, cross-browser code in a language full of edge cases, like JavaScript, is hard.

A one-line function, but 20 lines of tests and another 20 test environments.
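
For reference, the function at the center of the incident is roughly the following (a from-memory sketch of the published left-pad module, not a verbatim copy):

  // Pad a string on the left to the given length with a fill character.
  module.exports = leftpad;
  function leftpad(str, len, ch) {
    str = String(str);              // coerce numbers etc. to string
    var i = -1;
    if (!ch && ch !== 0) ch = ' ';  // default fill, but allow 0
    len = len - str.length;
    while (++i < len) {
      str = ch + str;
    }
    return str;
  }

The body is trivial; the edge cases (numeric input, a fill character of 0, lengths shorter than the string) are exactly where those 20 lines of tests earn their keep.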

The solution should not come from people that write only front-end JS code. I'm waiting for a response from the libraries that were broken by left-pad.

14
mbrock 9 hours ago 2 replies      
Maybe we could start to publish signed approvals of specific package hashes.

For example: "I, mbrock, think that pad-left v1.0.3 with hash XYZ seems like an uncompromised release."

Then the tool that upgrades packages could warn when a release isn't trusted by someone you trust (or transitively via some trust web scheme).

The approval system becomes like a "release review" process.
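
A minimal sketch of what publishing such an approval could look like, using Node's built-in crypto module; the file names, key handling, and approval format are all hypothetical:

  var crypto = require('crypto');
  var fs = require('fs');

  // Hash the exact tarball that was reviewed.
  var tarball = fs.readFileSync('pad-left-1.0.3.tgz');
  var sha256 = crypto.createHash('sha256').update(tarball).digest('hex');

  // Sign "name@version:hash" with the reviewer's private key (PEM file).
  var signer = crypto.createSign('RSA-SHA256');
  signer.update('pad-left@1.0.3:' + sha256);

  var approval = {
    reviewer: 'mbrock',
    package: 'pad-left@1.0.3',
    sha256: sha256,
    signature: signer.sign(fs.readFileSync('reviewer-key.pem'), 'base64')
  };
  console.log(JSON.stringify(approval));

An upgrade tool could then refuse, or warn on, any release whose tarball hash carries no approval signed by a key in your trust set.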

15
zalzal 4 hours ago 0 replies      
There is a bigger debate on micropackages, for sure. But even in the short term, breaking your build instantly every time third parties make changes is just madness. Reduce operational dependencies as well as library dependencies.

This is one approach we used to deal with this last year, for example, on build/devops side: https://medium.com/@ojoshe/fast-reproducible-node-builds-c02...

16
dec0dedab0de 9 hours ago 2 replies      
Maybe the solution for high-level languages is to just routinely add useful helper functions, either in separate namespaces or directly to the global namespace with a naming convention to avoid conflicts. If thousands of people are doing the same thing it really doesn't make any sense for them to all come up with their own version.
17
cdnsteve 3 hours ago 0 replies      
Micro deps are a sign of something missing from the core language. We should be working to expand the core language rather than have the gap shouldered by a package manager system and its community, IMO.
18
dougdonohoe 9 hours ago 0 replies      
19
nikolay 7 hours ago 0 replies      
That's where git-vendor [0] comes into play!

[0]: https://brettlangdon.github.io/git-vendor/

20
debacle 9 hours ago 0 replies      
This was well written. The balance between convenience and liability is something that takes time to digest.

I don't really understand why there isn't a stdlib of these "micropackages" that can be downloaded to save a lot of effort.

21
dc2 8 hours ago 2 replies      
> Multiplied with the total number of downloads last month the node community downloaded 140GB worth of isarray.

This is not true. NPM locally caches every module the first time it is downloaded.

Therefore with widely downloaded modules such as isarray, it is very likely it has already been downloaded on the local system and so is pulled from the cache.

The actual percentage of fresh downloads from NPM in a real-world deployment is overwhelmingly small.

22
emodendroket 7 hours ago 0 replies      
It's telling that only the immature JS ecosystem thinks this is a good idea.
23
orf 4 hours ago 1 reply      
Holy crap, I had a look through https://www.npmjs.com/~sindresorhus

There are so many one line packages:

https://github.com/sindresorhus/is-finite/blob/master/index....

https://github.com/sindresorhus/is-fn/blob/master/index.js

https://github.com/sindresorhus/is-gif/blob/master/index.js

https://github.com/sindresorhus/is-github-down/blob/master/c...

https://github.com/sindresorhus/is-ip/blob/master/index.js

https://github.com/sindresorhus/is-npm/blob/master/index.js

https://github.com/sindresorhus/is-obj/blob/master/index.js

https://github.com/imagemin/advpng-bin/blob/master/lib/index...

https://github.com/chalk/ansi-regex/blob/master/index.js (my favourite)

https://github.com/sindresorhus/array-move/blob/master/index...

https://github.com/sindresorhus/compare-urls/blob/master/ind...

https://github.com/sindresorhus/debug-log/blob/master/index....

https://github.com/sindresorhus/file-url/blob/master/index.j...

https://github.com/sindresorhus/fix-path/blob/master/index.j...

https://github.com/sindresorhus/fn-args/blob/master/index.js

https://github.com/sindresorhus/fn-name/blob/master/index.js

https://github.com/sindresorhus/globals/blob/master/index.js

https://github.com/sindresorhus/imul/blob/master/index.js

https://github.com/sindresorhus/is-text-path/blob/master/ind...

https://github.com/sindresorhus/is-up/blob/master/index.js

https://github.com/sindresorhus/is-travis/blob/master/index....

https://github.com/sindresorhus/is-video/blob/master/index.j...

https://github.com/sindresorhus/is-webp/blob/master/index.js

https://github.com/sindresorhus/is-archive/blob/master/index...

https://github.com/sindresorhus/is-admin/blob/master/index.j...

https://github.com/sindresorhus/is-absolute-url/blob/master/...

https://github.com/sindresorhus/ipify/blob/master/index.js

https://github.com/sindresorhus/is-url-superb/blob/master/in...

https://github.com/sindresorhus/is-tif/blob/master/index.js

https://github.com/datetime/leap-year/blob/master/index.js

https://github.com/sindresorhus/lpad/blob/master/index.js

https://github.com/sindresorhus/md5-hex/blob/master/index.js

And I ran out of willpower, at only L. Seems to me the complete lack of any decent standard library has caused this Cambrian explosion of packages, and the overhead is astounding. Sure it's appealing to google "nodejs validate ip", then run "npm install is-ip" and use it with "require('is-ip')", but fuck me how wasteful do you want to be. My frontend ember app ends up installing over 500mb of dependencies (most of which is useless test files and other redundant fluff files). How has this happened?

What's to stop one of those one-line packages adding a malicious one-liner that concatenates and uploads your super-secret private source code to somebody's server? You're really trusting the complete integrity of your codebase because you depend on "is-array", because you can't be bothered to write "x.toString() === '[object Array]'", and JS introspection (which seems so (ab)used) is so broken that this is needed? Woah.
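
For context, the idiom these packages wrap is a couple of lines. A sketch of the well-known check (the real isarray module is essentially the same thing behind a require()):

  // Prefer the native check where it exists; fall back to the
  // Object.prototype.toString trick for ancient engines.
  var toString = Object.prototype.toString;
  function isArray(arr) {
    return Array.isArray ? Array.isArray(arr) : toString.call(arr) === '[object Array]';
  }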

24
EGreg 7 hours ago 0 replies      
Isn't this similar to broken links on the web? You can either:

1) Bundle everything in your distribution. Not unreasonable, but would be nice to have a hybrid protocol that lets the publisher store a signed copy of all the dependencies but only send them on request (so less duplication is sent over the wire).

2) Have the same as 1 but in dedicated "mirrors" and persistent distributed storage a la freenet. Files are only deleted if there isn't enough space on the network and they are the least-recently-requested ones.

25
mgrennan 8 hours ago 0 replies      
If you don't know history.....

40 years of computer experience as an EE, coder, IT security person, and DBA tells me: when IT moved from a way to do work (science) to a thing of its own (marketing) -- this happened during the Dot-Com bubble -- time to market became the goal and security was tossed. You hear this in mantras like:

Push NOW, fix later. Fail fast and often.

I say: Secure, Complete, Fast - Pick two.

16
Using Google Cloud Vision OCR to extract text from photos and scanned documents github.com
90 points by danso  8 hours ago   21 comments top 12
1
ImJasonH 6 hours ago 1 reply      
While we're talking about the Google Cloud Vision API I'll take the opportunity to plug the Chrome extension I wrote that adds a right-click menu item to detect text, labels and faces in images in your browser:

https://chrome.google.com/webstore/detail/cloud-vision/nblmo...

Try it out, let me know what you think. File issues at github.com/GoogleCloudPlatform/cloud-vision/

2
jyunderwood 5 hours ago 1 reply      
At work I replaced a Tesseract (https://github.com/tesseract-ocr) pipeline with some scripts around the Cloud Vision API. I've been pleased with the speed and accuracy so far considering the low cost and light setup.

Btw, here is a Ruby script that will take an API key and image URL and return the text:

https://gist.github.com/jyunderwood/46b601578d9522c0e9ab
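
For comparison, the same call in Node is a single HTTPS request. A rough sketch based on the public v1 REST API, with the API key and file name as placeholders:

  var https = require('https');
  var fs = require('fs');

  // The API takes the image bytes base64-encoded in the request body.
  var body = JSON.stringify({
    requests: [{
      image: { content: fs.readFileSync('scan.jpg').toString('base64') },
      features: [{ type: 'TEXT_DETECTION' }]
    }]
  });

  var req = https.request({
    hostname: 'vision.googleapis.com',
    path: '/v1/images:annotate?key=YOUR_API_KEY',
    method: 'POST',
    headers: { 'Content-Type': 'application/json' }
  }, function (res) {
    var data = '';
    res.on('data', function (chunk) { data += chunk; });
    res.on('end', function () {
      // The first annotation is the full block of detected text.
      var result = JSON.parse(data);
      console.log(result.responses[0].textAnnotations[0].description);
    });
  });
  req.write(body);
  req.end();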

3
zurbi 5 hours ago 0 replies      
This was useful information. Testing this has been on my todo list for weeks now.

I read about these limitations in the Cloud Vision OCR API docs, but could not believe that they would indeed not provide data at the word or region level. Does anyone have any idea why?

I mean, they must have this data internally and it is key for useful OCR.

Currently I am using the free OCR API at https://ocr.space/OCRAPI for my projects. It also has a corresponding Chrome extension called "Copyfish", https://github.com/A9T9/Copyfish

4
Mithaldu 7 hours ago 0 replies      
Submitter: If you're also the author, thank you for sharing your efforts. I needed exactly this kind of information to improve protection against cp spammers who had switched to posting images with the URLs on one of my websites. I had, however, not been able to find out how to start using OCR APIs, so this is a godsend.
5
zandorg 1 hour ago 0 replies      
I found this great software (called TIRG) which is free, open source, and finds text in images (though it doesn't normalise to black / white).

Compiles fine on Windows.

https://sourceforge.net/projects/tirg/files/

6
alex_hirner 5 hours ago 1 reply      
@danso, if there are any delimiters in the output (tesseract case) and you are looking for automatic table extraction, check out http://github.com/ahirner/Tabularazr-os

It's been used with different kinds of financial docs such as municipal bonds. Implemented in pure Python, it has a web interface, simple API and does nifty type inference (dates, interest rates, dollar amounts...).

7
misiti3780 6 hours ago 0 replies      
I was recently testing out Google's OCR for some PDF docs - I thought it worked really well (and it's pretty reasonably priced). I didn't care so much about the structure of the response/document.
8
steeve 5 hours ago 1 reply      
We got amazingly good results using SWT [1] for text detection/boundaries and Tesseract for OCR. Pretty much on par with the results here.

We used to run this on videos.

[1] http://libccv.org/doc/doc-swt/

9
dtjones 7 hours ago 1 reply      
Seems simple and effective, thanks for sharing. What is the request latency?
10
yborg 5 hours ago 1 reply      
This is cool ... any idea what languages are supported? All I can find in the Google docs is "Vision API supports a broad set of languages."
11
sagivo 5 hours ago 0 replies      
They compare it to Tesseract, but I really tend to like the open source version.

A simple service that has a free plan on top of it can be found here - https://scanr.xyz/

12
thesimon 5 hours ago 1 reply      
Thanks for sharing. Did you try using it for captchas?
17
Google parent Alphabet ushers in fiscal discipline era usatoday.com
21 points by e15ctr0n  2 hours ago   6 comments top 3
1
ChuckMcM 1 hour ago 2 replies      
Yeah, pretty much as expected [1] :-) A good friend of mine was at Cisco when this mindset hit really really hard. He said everyone learned to ask "Is what I am working on Core to Cisco's business?" because if the answer was "no" it meant you were on the way out until the answer was "yes". It certainly helped inject tension into the employees at Cisco.

[1] https://news.ycombinator.com/item?id=10037511

2
apalmer 22 minutes ago 1 reply      
I am not sure I understand how this is supposed to work. If we are being really logical about the financial discipline, shouldn't like 75% of Alphabet subsidiaries shut down?
3
vgeek 23 minutes ago 0 replies      
Tough news for anyone in the SEO industry. Better go ahead and get AdWords certified.
18
Advanced Linux Programming book with free PDF (2001) advancedlinuxprogramming.com
188 points by nonrecursive  12 hours ago   32 comments top 12
1
ctur 10 hours ago 2 replies      
Two alternative, and better, choices if you're willing to spend a few dollars (and hopefully expense it to your manager):

The Linux Programming Interface: http://www.amazon.com/Linux-Programming-Interface-System-Han...

Advanced Programming in the UNIX Environment: http://www.amazon.com/Advanced-Programming-UNIX-Environment-...

A not-too-distant third choice, Linux System Programming: http://www.amazon.com/Linux-System-Programming-Talking-Direc...

2
richm44 10 hours ago 1 reply      
I'd recommend this book instead: 'The Linux Programming Interface' (http://man7.org/tlpi/). It's a lot more recent and the content is excellent, though it's not a free download. The author is also the maintainer of the Linux man pages.
3
nickles 10 hours ago 4 replies      
>> http://advancedlinuxprogramming.com/about.html

This appears to be the first edition of the book, published in 2001. Is this book still useful, considering it's 15 years old?

4
ruraljuror 5 hours ago 1 reply      
I am getting interested in Unix programming, but essentially starting from scratch. I've read through the first two chapters of the The Linux Programming Interface, and while I think it is a great book, so far it has not been easy for me to find points of entry. I think it might not be the best book for me to start with.

I've just started How Linux Works, but I thought I might take this opportunity to ask for suggestions for someone trying to get started (doesn't have to be a book).

5
bluedino 6 hours ago 0 replies      
Is this a good read after finishing Beginning Linux Programming (big red book by WROX) by Neil Matthew and Richard Stones?
6
thatguy_2016 4 hours ago 0 replies      
It seems many people don't like this book. Why's that?
7
ak2196 10 hours ago 0 replies      
Stevens' APUE is a much better choice.
8
jaguar86 7 hours ago 1 reply      
This is a good write up for IPC related stuff - http://beej.us/guide/bgipc/output/html/singlepage/bgipc.html
9
kidgorgeous 9 hours ago 1 reply      
Why are all the chapters in separate PDFs? Is that unintentional or an enticement to buy the book when it comes out? If it's the former, has anybody got a full PDF download link?
10
maqbool 4 hours ago 0 replies      
pretty good book
11
massysett 6 hours ago 2 replies      
Book does not understand its audience. "An editor is the program that you use to edit source code." No "advanced" programmer needs to be told that. OK, maybe if all you know is an IDE, you don't know this...but the book says it's assumed that the reader can do basic command-line stuff. If you can do that you must know what an editor is.

Also I saw "GNU/Liux" in the front matter.

[edit] Next section started "A compiler turns human-readable source code into machine-readable object code that can actually run."

12
kev009 9 hours ago 3 replies      
If you are doing "Linux programming" instead of UNIX programming, you are most likely doing something wrong.
19
Generalizing JSX: Delivering on the Dream of Curried Named Parameters tolmasky.com
72 points by tolmasky  7 hours ago   20 comments top 9
1
tolmasky 2 hours ago 0 replies      
In case anyone is curious, I have since posted some additional thoughts:

1. Unifying default parameters with curried named parameters: https://tonicdev.com/tolmasky/default-parameters-with-generi...

2. Using generic JSX to declaratively specify JavaScript ASTs: https://tonicdev.com/tolmasky/generic-jsx-for-babel-javascri...

2
WorldMaker 4 hours ago 1 reply      
This is a good post with a lot of ideas to stew on.

I think my only criticism of the current approach in the linked GitHub repo [1] is the use of `eval`, as that could make it tough to utilize in locked-down/sandboxed environments such as mobile apps and CSP-controlled sites.

I'll save attempting an actual code exercise for later, but my first thought was maybe using JS implicit `this` to your advantage (even while trying to stray away from classes) and using `this[functionName]` to index into it. (You'd still have security concerns with `this` defaulting to `window` or `global`, but you'd avoid the need for `eval` and you can take advantage of the depth of abilities of constructor functions and ES2015 class syntax.)

[1] https://github.com/tolmasky/generic-jsx
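A rough sketch of that registry idea (hypothetical names, not generic-jsx's actual API): resolve element names against an explicit object instead of eval'ing them in the caller's scope:

  // Hypothetical sketch: look up JSX element names in a registry object
  // rather than eval'ing them, so it still works under a strict CSP.
  const registry = {
    foo: (attrs, children) => ({ tag: 'foo', attrs, children })
  };

  function resolve(name, attrs, children) {
    const fn = registry[name];
    if (!fn) throw new Error('Unknown element: ' + name);
    return fn(attrs, children);
  }

  // resolve('foo', { id: 'my-element' }, ['hi!'])
  // Trade-off: no eval, but every component must be registered explicitly.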

3
DougBTX 1 hour ago 1 reply      
Regarding the hack to make function calls: if JSX elements start with a capital letter, then the pragma is ignored, and the element name is used as a local variable name instead, e.g.:

 transform("<Foo id = 'my-element'>hi!</Foo>")
outputs

 Foo({id: "my-element"}, ["hi!"])

4
rook2pawn 2 hours ago 0 replies      
Very relevant https://github.com/substack/hyperx

I code in React without JSX and webpack. I recommend hyperx for your code. Hopefully new modules can be built out of this one.

5
malekpour 4 hours ago 0 replies      
Very good article; it is easy to predict great use cases for the JSX concept in the future.

I proposed the idea of using JSX as Angular 2 templates. They didn't like it, and I think the React competition was an important factor in that.

https://github.com/angular/angular/issues/5131

6
tobr 4 hours ago 1 reply      
I think you mixed some nice ideas with a very weird idea for no good reason. I kept looking for why you want to do this through JSX, but I just didn't find anything convincing.

I like the idea of using the "from" function to rename applied parameters. But that and everything else would be equally possible and a lot less confusing if it didn't involve JSX, no?

7
e_d_g_a_r 5 hours ago 1 reply      
"Dream of Curried Named Parameters" What dream, this is a daily reality in OCaml.
8
agumonkey 2 hours ago 0 replies      
Seeing 'xml' trees reminds me of a child of ML and XSLT...
9
draw_down 6 hours ago 3 replies      
A certain portion of programmers seems convinced that currying offers a huge amount of power, and we just haven't unlocked it yet. But I don't think I understand why they feel that way.
20
Decoding the N.F.L. Database to Find 100 Missing Concussions nytimes.com
70 points by clorenzo  8 hours ago   11 comments top 4
1
markrote 6 hours ago 2 replies      
Far more important, and certainly not coincidental, is that the NFL employed some of the same lobbyists, lawyers, and consultants as the tobacco industry.

http://www.nytimes.com/2016/03/25/sports/football/nfl-concus...

2
thinkcontext 4 hours ago 0 replies      
Does anyone else find the timing of this and the NFL's recent surprise admission before Congress[0] interesting? I'm curious if they did that because they knew this story was coming out, undermining their science which produced uncertainty about the link.

[0] http://www.nytimes.com/2016/03/16/sports/nfl-concussions-cte...

3
hbcondo714 4 hours ago 0 replies      
Here's the NFL's response to this New York Times story:

http://www.nfl.com/news/story/0ap3000000647389/article/nfl-r...

4
iaw 7 hours ago 3 replies      
While unfortunate I don't find this that surprising (or even nefarious). Getting every doctor to consistently do something outside of their normal scope is probably pretty challenging. Sometimes things can slip.

As for the Dallas Cowboys, well, they clearly just couldn't be bothered.

21
Show HN: Station307 Stream files between cURL, Wget and/or browser station307.com
21 points by hakvroot  4 hours ago   7 comments top 4
1
hakvroot 2 hours ago 0 replies      
Author here! This is a little side project I created because I found myself using scp + email to send a file from a server to someone sitting right next to me more than once. I have a different project which does relaying over HTTP so I figured it might be a good idea to reuse a part of the codebase for a simple file streaming service. I finished it yesterday and it already proved useful for myself, and I hope it also proves to be useful for you.

If there are any (more) questions on the what or how, I'm happy to answer them. The day here is however coming to an end so it might take me a couple of hours. While I'm gone, please be aware that the service is running on a $5/month VM ;).

2
prohor 3 hours ago 1 reply      
Very nice. I've bookmarked it.

By the way - there was another similar service recently, but it stores files: https://transfer.sh/ , https://news.ycombinator.com/item?id=11322007

3
exhilaration 3 hours ago 1 reply      
How does this work? I see the pending POST in the Chrome network tab, which completes when someone downloads it, and then becomes pending again for the next download. So are you proxying the uploaded data to the downloading client?
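If so, here's a toy sketch of what I imagine the mechanism looks like (my guess at the general shape, not Station307's actual code), using Node's built-in http module:

  // Toy HTTP relay: the uploader's POST stays open until a downloader's
  // GET consumes the stream, so nothing needs to be written to disk.
  const http = require('http');
  const pending = new Map(); // path -> { upload, uploadRes }

  http.createServer((req, res) => {
    if (req.method === 'POST') {
      pending.set(req.url, { upload: req, uploadRes: res });
      // The uploader's response is deliberately left open here.
    } else if (req.method === 'GET' && pending.has(req.url)) {
      const { upload, uploadRes } = pending.get(req.url);
      pending.delete(req.url);
      upload.pipe(res); // bytes flow straight from uploader to downloader
      upload.on('end', () => uploadRes.end('delivered\n'));
    } else {
      res.statusCode = 404;
      res.end();
    }
  }).listen(8080);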
4
disaster01 4 hours ago 1 reply      
This is very useful! How do they plan to earn money with it?
22
Ask HN: Best payment processor?
46 points by Gaessaki  2 hours ago   36 comments top 15
1
johnnyg 1 hour ago 0 replies      
We've run on Stripe for 2 years now.

Good onboarding, snappy support replies by smart people, they do what they say they'll do, good transparency when there are mistakes, early heads up on changes. Overall, they are solid people running a solid business. We respect them and enjoy being clients of theirs.

We're a medium sized business and we run charges pretty regularly throughout the day. Their status page only reports their larger outages. If you are running charges regularly, expect their endpoints to go down for 20 seconds up to 2 minutes, 2-3 times a month. It's to the point where our customer service chat knows what's going on and says things like "ask the customer to wait 2 minutes, it'll be right back". It's frustrating. We've considered setting up a failover with Spreedly and Braintree, but it's juuuust inside the threshold of annoying enough to do all that.

That minor gripe said, I still have the Stripe afterglow because we suffered through the Auth.net days and the AMEX domination days. Stripe set us, and everybody else, free from that jazz. We're still grateful and probably always will be.

Lastly, I note that if your volume merits it, they will discuss alternative rates within reason.

You should go with Stripe.

2
eldavido 59 minutes ago 0 replies      
I'm in the middle of redoing the card processing system for a hotel property management system.

I'd consider using Stripe and have in the past, but we're doing millions/month USD (less than you'd think with a high-dollar, low-margin business) almost entirely card-present, now with EMV, which puts us way outside Stripe's target market. We also have a complex approval process and capture/settlement several days after approval, which, again, isn't really in stripe's wheelhouse.

Don't use Heartland. I've had to deal with them a lot over the past month and they're absolute garbage. Shit API, integration specialists that can't be bothered, NDAs before they'll look at you - just an all-around bad experience. They also announced they're deprecating SHA-1 support, which would be prudent except that they gave merchants two weeks' notice before doing it.

3
palidanx 13 minutes ago 0 replies      
I've integrated Braintree for monthly recurring subscription services and for digital product purchases. For my own purposes, I built my own cart in Rails as I needed a degree of flexibility in check-out.

So far I haven't run into any problems. The only thing I really needed was to customize customer e-mails, so I used Braintree's webhooks to my server to send out my own e-mails.

Support wise, I was on the phone with them quite a bit in the beginning, and they were nice and knowledgeable. I used them pre-PayPal and post-PayPal, and I haven't noticed any differences.

Fraud wise, my customers tend to be reliable ones so I haven't had to worry about fraud yet.

4
wuliwong 2 hours ago 3 replies      
I switched to Braintree after having some trouble with the initial integration of Stripe in a Rails application I made about 9 months ago. I am an experienced Rails developer and found the Stripe documentation to be a bit out of step with "the Rails way" and a little incomplete. I switched to Braintree and was up and running with relative ease.

I can't speak to my experience as a Braintree customer as sadly my app hasn't processed many payments but the initial integration using Braintree was much easier for me.

5
chrisgoman 2 hours ago 1 reply      
Your details show that you know your business well. What exactly does a high volume of payments mean? What do low margins mean? With credit card processing, it is about specifics. The magic numbers are:

1) Dollar amount processed by month

2) Average dollar amount per transaction

From there, you can figure out the rest

For startups, the default is ALWAYS Stripe (IMHO) because you can get to processing cards right away - like in 5 minutes. Their API is easy, their virtual terminal works as expected for manually keyed in transactions. There is even a super dumb screen that is just HTML (Checkout) if you don't even want to deal with an API.

As far as the fees, the $0.30 per transaction fee is always going to be there. The 2.9% percent is negotiable, specially if you are doing volume. If you are doing $1,000/mo, 2.9% will be your rate.

If you are doing much higher volume, that rate goes down. For example, if you can show (with proof via bank statements or from your current merchant provider) something like $300-400k per month for the last 3 months, I was able to get them very close to what is called the "interchange rate" which is the lowest rate you can get even with traditional processors like Moneris (FirstData). My current "effective rate" is about 1.6-1.7% (effective rate is my quick math of fees/total).
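(Worked example with made-up numbers: $5,100 in total fees on $300,000 processed works out to 5100 / 300000 = 1.7% effective.)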

Like somebody already said, micropayments are pretty bad.

High volume + low margins makes it sound like you are operating a restaurant, where the fees are quite different due to the physical nature of the business (less fraud due to card present) vs. internet (card not present).

6
flurdy 1 hour ago 0 replies      
In my previous UK startup, for an online game (100 million players), we used multiple providers: Vindicia, Playspan, and PayPal, as well as physical shop-based game time cards. And then also later directly with Apple's App Store and Google's Play store.

You might find yourself requiring multiple providers for different requirements and redundancy.

Playspan, for example, was very good at the South American market; Vindicia very good at retrying failed payments and auto-replacing card details for recurring subscriptions. And a lot of customers expected to be able to pay with PayPal, who are also good at fraud prevention.

But to echo everyone here, if my "next" startup requires a payments provider I would initially go with Stripe and/or Braintree.

7
joshjkim 1 hour ago 1 reply      
Not sure if low margins = smaller $$ value for each payment or just a small % you're taking, but PayPal offers a micro-transaction fee rate: 5% + 5 cents (USD). For transactions under $12, this is usually preferable to the standard rates from PayPal or Stripe. If transaction size goes above and below that, you can set up two separate accounts and direct payments above a threshold to one account and payments below the threshold to another account. We do that at my company, and it saves us LOADS in fees.
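A quick sanity check of that $12 threshold, using the rates quoted above (5% + $0.05 micro vs. 2.9% + $0.30 standard):

  // Break-even where 0.05x + 0.05 = 0.029x + 0.30  =>  x = 0.25 / 0.021
  const breakEven = 0.25 / 0.021;        // ~11.90, so "under $12" checks out
  const micro = x => 0.05 * x + 0.05;
  const standard = x => 0.029 * x + 0.30;
  console.log(micro(5), standard(5));    // on a $5 sale: $0.30 micro vs ~$0.45 standard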

Last I checked, no other service has any separate fee for micropayments. Paypal doesn't make it easy to find this info out, but it definitely still offers the product.

Of course, PayPal can be really, really annoying to work with, as their code is VERY old and their APIs can be confusing. They also have a weird variety of products that semi-overlap, so selecting which specific product to use can be confusing (Express Checkout vs. Instant Checkout vs. Adaptive Payments..?).

A few other things to note: PayPal's international coverage is much better than any other provider's, so if you want to expand to the EU or South America quickly, they are good for that. Also, PayPal's payout process (MassPay) is much easier to use than ACH or any other solutions I've seen, and much cheaper (2% transaction fee for payouts). Also, all the payee needs is an email address; no bank account info, etc.

Truthfully, those are the two reasons we still use PayPal. Otherwise, it kinda sucks =)

8
Cieplak 44 minutes ago 1 reply      
Just to list a few:

- Stripe

- Braintree

- Adyen

- Paypal

- Wells Fargo Merchant Services

- Chase Paymentech

- Vantiv (acquired Litle)

- Forte (acquired ACH Direct)

- WorldPay

- Skrill

- Moneris

- Coinbase

- AMEX

- Auth.net

Stripe is probably your best bet, but should you decide to choose anyone else, let me know if you'd like any help as I've integrated with most of these processors before :)

PS: Ultimately your decision should come down to your card blend, i.e., if you process mainly debit cards (Durbin regulated and basically zero interchange), go with a processor who will offer you interchange-plus pricing. If you process mostly premium cards, go with a processor who will offer you a blended rate, since they'll probably be eating their AMEX transaction fees (typically around 3.5%).

9
supster 2 hours ago 0 replies      
I'm currently running a Node.js/Express.js/MongoDB service with a Bootstrap/jQuery frontend and iOS + Android clients. My payments mix also tends to be high volume and low margin. I have been extremely satisfied with Stripe - they have a solid API, a great npm package[1] to interface with their API, thorough documentation with examples in Node[2], and a wonderful dashboard. I highly recommend them.

1) https://github.com/stripe/stripe-node

2) https://stripe.com/docs/api/node#intro

10
coreyp_1 2 hours ago 1 reply      
We found Stripe to be the easiest and most affordable to implement. $.30 + 2.9% processing fee per transaction, though.

Micropayments are tough (if that's what you're going for).

11
zurbi 2 hours ago 0 replies      
Usually I recommend Avangate, Bluesnap, Cleverbridge or Fastspring. The best part is that with these services there is nothing to configure. You just link to your page on their servers, and they even theme it for you.

Downside: The cost is somewhat higher. So I don't think they work for a low margin business.

12
joeld42 1 hour ago 0 replies      
I will echo the recommendations for Stripe. I used it for a side project that needed to process payments for enrichment classes organized by our school's PTA, and have had zero problems over the last two years, processing around $150k of payments.
13
hotpockets 2 hours ago 1 reply      
Could you use an ACH solution? I guess consumers may not be familiar with it, but stripe ACH looked customer friendly. I think they charge 0.8%.
14
contingencies 2 hours ago 0 replies      
If any payment processor is interested in having someone who knows the ins and outs help them implement IBAN-based bank transfers for customers in SEPA or other regions really well, let me know.
15
ckorhonen 1 hour ago 0 replies      
Braintree
23
Do you have the brains for cybersecurity? bbc.co.uk
61 points by pelf  9 hours ago   17 comments top 9
1
tshadwell 8 hours ago 3 replies      
What area of 'cybersecurity' would I be finding myself breaking substitution ciphers based on wingdings in?

I work in the information security industry, and I feel like I'm missing something but I really have to ask what these are relevant to.

Cryptography, which this appears to be a reduced form of, is mostly tangential and very nuanced relative to the ciphers in this challenge. I often feel my line of work is grossly misrepresented by dizzying fields of esoteric numbers and references to ancient cryptography, when I'm happy to find myself spending many of my days engrossed in the security characteristics of some powerful technology used right now in the real world.

I moved from engineering to security, but if this was my only interaction with security, I'm not sure I'd have been interested.

Edit: if you're interested in real crypto challenges, try http://cryptopals.com/ and read Cryptography Engineering, which is a wonderful read that goes over not only the cryptography but also the principles common across the many specialisations of the infosec industry

2
AdmiralAsshat 3 hours ago 0 replies      
I wasn't aware I had to explain how the crypto works in order to advise my clients that they should be disabling outdated SSL versions on their servers and retiring RC4 ciphers.

Evidently I don't have the brains for cybersecurity. My clients should be just fine with their telnet-enabled/remote-root-accessible servers until someone who can descramble Wingdings riddles can save them.

3
patcheudor 7 hours ago 0 replies      
I have mixed feelings about this. While being a good puzzle solver is important, to be really good you need a certain level of creativity in thinking which goes beyond just the ability to solve puzzles. Thinking like a criminal, as an example, is a necessity in a number of cyber-security fields and can trump the ability to solve puzzles. I see a lot of vulnerabilities get marginalized because people simply can't see how they could be used by a criminal to make money. Likely for a reason: it's the ability to think like a criminal which is largely missing, and where people do have that ability, many times they are treated by their cyber-security peers as a bit suspect.
4
AndyMcConachie 3 hours ago 1 reply      
This is probably a recruitment operation. Not that there's anything wrong with that, but I think that's what this is.
5
zubspace 9 hours ago 0 replies      
If you enjoy this, maybe you will like the challenges of Hacking-Lab (https://www.hacking-lab.com).

Right now there's a Hacky Easter competition running which you can participate in for free: http://hackyeaster.hacking-lab.com/hackyeaster/challenges.ht...

6
merpnderp 8 hours ago 0 replies      
This might be the optimal place to start (Khan academy's excellent intro): https://www.khanacademy.org/computing/computer-science/crypt...
7
Moppers 4 hours ago 0 replies      
I can't do one of these. It's the middle one of the last part. The diagram with the pentagon.
8
terminado 7 hours ago 0 replies      
No[1], because "cybersecurity" is an open-ended non-static target, with human adversaries in the loop, who will adapt to circumstantial changes dynamically.

 [1] https://en.wikipedia.org/wiki/Betteridge's_law_of_headlines

9
mtgx 9 hours ago 1 reply      
What's the point if they're just going to ask for backdoors in those systems later?
24
Show HN: Watch movies with the freedom to filter github.com
113 points by marco1  8 hours ago   97 comments top 24
1
callumlocke 5 hours ago 6 replies      
Somehow this makes me really uncomfortable. I know it seems very reasonable, inviting people to choose what they want to censor for themselves... but it just feels wrong. I'd much rather someone didn't watch my film at all than watch a butchered version. As a separate point, I think it's often more traumatic/damaging to cut away from a scene just as it's getting distressing, and then cut back after the bad thing has happened. I often had this kind of viewing experience imposed on me as a child, and I honestly think it screwed me up a little, sitting there speculating about whatever I was missing, and generally being taught that when things are getting unpleasant you just skip that part. Not a good attitude for life. It's true there are films that are inappropriate for a young kid, but in that case, choose a different film.
2
bosdev 6 hours ago 3 replies      
I dislike that the filters are strictly defined. What if I wanted to create a category for 'rape' or 'animal-cruelty'? I should be able to add those notes and have them be silently ignored by clients that don't support them. Even better though, I don't see why the client wouldn't include those new categories in what can be filtered for this specific clip.

Edit: Also, it seems like the severity levels (low/medium/high) are arbitrary, meaning they will vary from film to film. It might make more sense to give categories sub-categories or tags. So for swear words, it would have the sub-categories of all the various words which might be used. For violence, it might have the tags of ('simulated', 'on screen', 'bloody'), etc. That way the viewer could decide what they want to see, and it would be somewhat consistent.

3
Eric_WVGG 5 hours ago 1 reply      
This sounds like complete bullshit to me, and I find the entire idea pretty reprehensible. But on the off chance that this helps someone who is a survivor of sexual assault and buys into the idea of trigger warnings, there perhaps ought to be a differentiation between "sex" and "sexual violence" or "rape"
4
gdw2 6 hours ago 4 replies      
Aside from filtering based on moral/taste/etc grounds, I've often thought it would be interesting to filter for the sake of time. Could you conceivably trim a 3hr movie down to 90 minutes and have it still make sense? It would be an interesting experiment.
5
compiler-guy 4 hours ago 2 replies      
I doubt I'll use this technology myself--I generally prefer to see movies as they were created--but recuts and derivative works can be works of art in and of themselves. One could have used this technology to create "Star Wars--The Phantom Edit", for example.

https://en.wikipedia.org/wiki/The_Phantom_Edit

Production companies often create airplane and television versions of movies, and it isn't any big deal. I see no reason why customization of this sort is immoral since no one is required to use it.

6
9999 2 hours ago 1 reply      
A tool to "filter" a work of art for yourself is also a tool to censor a work of art for someone else. In this case, and judging by the reactions from some parents here, most likely for your own children. So, despite the interesting framing of this as something that will create the "freedom to filter," it inevitably will be used to oppress.

Regardless, the end effect on the viewers (both willing and unwilling) is that they have not seen the work. They may in fact have a completely different understanding of the film or show than someone who actually has seen the work. A morally ambiguous character becomes wholly righteous. A villain becomes the hero. Violent acts cease to have a consequence of gore, perhaps of pain or death. Romance no longer leads to sex. Dialogue cut for swearing completely changes the perception of a character, the story, and the world.

In the end you have seen something worse than nothing at all--a cartoon caricature of what the creators intended. You have wasted your own time, perverted the aims of the artists behind the work, and done your children a disservice.

7
gdw2 6 hours ago 0 replies      
Is there an existing collection of free filters for popular movies anywhere? What would it take to get this working with streaming services (like VidAngel does -- I think they pull from Google Play, but not sure)?
8
vdwijngaert 6 hours ago 0 replies      
Makes it way easier to directly skip to the raunchy scenes! ;-)
9
kempbellt 3 hours ago 0 replies      
First thing that came to mind: Game of Thrones.

I imagine episodes would be 10 minutes long and you'd have no idea what's happening. Not saying they don't go over the top at times, but the brutality of a show like GoT is a very powerful illustration tool, and filtering out bits and pieces because they make you uncomfortable would ruin the entire purpose of the show.

I'm sure some people will find a use for this tool, but I will likely forgo it.

10
rmc 4 hours ago 1 reply      
Interesting idea. However there is nothing for racism/sexism/homophobia/transphobia, so I added a pull request for it[1]. It does seem like a weirdly specific list of categories. Why not let people filter whatever they want?

[1] https://github.com/delight-im/MovieContentFilter/pull/2

11
jasonkostempski 5 hours ago 0 replies      
The only things I find offensive are commercials and Kardashians; will this work on those?
12
rrowland 2 hours ago 0 replies      
Maybe fewer people would be complaining about opt-in filtering if the thread were titled "Watch movies with the freedom NOT to filter". This protocol would actually help those that want to watch their TV unfiltered, because it could replace the current system of filtering being forced on everybody.
13
shepik 3 hours ago 1 reply      
Niice. So if somebody were to mark all non-action scenes (like dialogues) in action movies as "language", I'd finally be able to watch some real action. Nice.
14
scottjad 5 hours ago 0 replies      
Are there some tools that make it easy to make these filters? It would be nice to have something built-in (as a plugin?) to mpv or VLC.

I've created several filters for personal use with EDL. mplayer has a decent UI way of doing that while watching the film by pressing 'i' to start a cut and 'i' again to end the cut. It's nowhere near perfect though because you still have to edit the edl file to decide whether to cut video or mute audio.
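For reference, a tiny EDL file looks something like this (mplayer's format, as I understand it: start and end times in seconds, then an action code, where 0 skips the span and 1 mutes it):

  123.0 145.2 0
  300.5 302.0 1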

One nice feature you could add to this editing UI would be to use the subtitle file to allow jumping to the locations of certain language in order to edit them (times in subtitles aren't enough to edit a single word).

15
gedy 3 hours ago 0 replies      
This is very handy; sometimes there's a really enjoyable movie you'd like to watch at home, but it has a few parts that are not appropriate for kids. I don't see the need for a high horse about censorship in this case.
16
al2o3cr 3 hours ago 0 replies      
On the flipside, reverse the sense of how that tool generates EDL and you've got a "shorten Hollywood blockbusters to the good bits" system... ;)
17
calcsam 7 hours ago 1 reply      
https://www.vidangel.com/ does this as a service
18
confusedLearner 4 hours ago 0 replies      
I thought it would use deep learning to do this. Silly me..
19
tsunamino 5 hours ago 0 replies      
Hi! I have a similar project called Feerless that provides crowd-sourced, preemptive notifications for Netflix at feerless.us. Would love to chat about a possible collaboration.
20
johnloeber 7 hours ago 1 reply      
How does this work?
21
robraven 6 hours ago 1 reply      
Really cool idea!
22
jcoffland 6 hours ago 6 replies      
This is basically a platform for movie censorship. Who gets to decide which parts of what movies are violent, sexual, etc.? I know I don't have to use it, but regardless I hope this does not catch on. If you don't like the content in movies, you don't have to watch them.
23
smacktoward 5 hours ago 3 replies      
Freedom to filter? "Freedom to deface a work of art" is more like it.

A work of art is an artist trying to tell you something. Chopping out bits based on your petty personal prejudices is like going through the Louvre and drawing clothes on all the nudes with a Sharpie. Even if you're the only one who ever has to see the defaced version -- and you can't tell me that your "freedom to filter" won't eventually start being forced down the throats of other people, like children in schools -- who gave you the right to make the decision on the artist's behalf that the defaced version doesn't compromise their work?

If you don't like art, you have always had the only freedom that matters, which is to choose not to look at it. You can live in your bubble and never risk offense if you want to; you just have to be willing to have the courage of your convictions and forgo the things you oppose. "Filtering" is just a way to try and have it both ways, to have your cake and eat it too. It's every bit as venal and cowardly today as it was when Thomas Bowdler (https://en.wikipedia.org/wiki/Thomas_Bowdler) was doing it in the 19th century.

Art is what it is. If you don't like it, don't pollute it by trying to "fix" it; leave it alone and maybe try making some of your own that meets your standards.

24
etjossem 1 hour ago 0 replies      
Movie ratings and content warnings don't hide anything. They simply let the viewer make an informed decision up front, with full knowledge of what to expect. If you're worried about desensitizing your kid to violence at an early age, there's nothing unethical about refusing to put on a violent movie until they're older.

But this project isn't a way to be more informed about content. It's an expurgation engine.

And that's way more insidious. Run "The Godfather" through a violence filter, and the viewer can no longer comprehend the stark contrast between the patriarch Vito and his violent son Michael. Tell the filter you're uncomfortable with death, and you'll never see Mulwray's body come out of the river in "Chinatown" - a critical missing piece taken from an already complex and nuanced plot. You're doing your kid a huge disservice by passing either of those off as the real thing.

Not every problem needs software to solve. Wait a few years, watch the genuine article together, and talk about it afterwards.

25
Apples First Foray into Original TV Is a Series About Apps nytimes.com
36 points by jackgavigan  4 hours ago   41 comments top 8
1
mmanfrin 3 hours ago 5 replies      

 Apple announced on Thursday that it was working with the entertainer Will.i.am
Welp, this is going to be a trainwreck.

2
bishnu 3 hours ago 5 replies      
I find the concept of a TV show that can only be seen on certain devices and OSes to be sad.

I guess we've been trending in this direction for a while (no Amazon Prime Streaming app on Apple TV, for instance).

3
kingnight 2 hours ago 1 reply      
Ew. This sounds very tasteless.

How about doing something creative that's not navel-gazing, or better yet using this money on OS X.

4
ismail 49 minutes ago 0 replies      
Will.i.am got Apple boom boom pow.
5
robertwalsh0 2 hours ago 2 replies      
Why can't Apple just stick with what it's good at?
6
johansch 1 hour ago 0 replies      
If this isn't Apple jumping the shark I don't know what it would be....
7
samstave 3 hours ago 0 replies      
I'm reminded of the incredibly bad "Silicon Valley" (not the HBO one, which is amazing - but the one by Randi Zuck...) and how REALLY REALLY bad that was - I doubt Apple will fail that bad, but we will see.
8
askyourmother 2 hours ago 1 reply      
If they want ideas for shows, how about life at uni in the US in the year 2016. Over here in Europe, we either get the view that it is like back in the Revenge of the Nerds movies, or worse, stuff like this:

http://www.zerohedge.com/news/2016-03-24/emory-students-scared-pain-after-safe-space-violated-word-trump-written-chalk

Hopefully it is not like Revenge of the Nerds in 2016, and hopefully it is also not full of students crying in "safe" spaces.

Might even be interesting!

26
Markov Chain Monte Carlo for Bayesian Inference The Metropolis Algorithm quantstart.com
70 points by shogunmike  12 hours ago   8 comments top 3
1
Houshalter 7 minutes ago 0 replies      
I'm sad that every explanation of MCMC uses complicated symbolic notation. I was trying to explain it once and came up with a fairly intuitive way to visualize it. Probably that is the way it was first discovered, though who knows.

In words, you can visualize it as doing a random walk around the area of a probability graph. Then it just turns into an explanation of why random walks accurately sample from an area, and the various tricks used in MCMC methods to make random walks more efficient.
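To make that concrete, here's a minimal random-walk Metropolis sketch (JavaScript just for illustration; logDensity is whatever unnormalized log-density you want to sample from):

  // Random-walk Metropolis: propose a nearby point, accept with
  // probability min(1, p(new)/p(old)); rejected steps repeat the old point.
  function metropolis(logDensity, x0, steps, stepSize) {
    const samples = [];
    let x = x0, logp = logDensity(x);
    for (let i = 0; i < steps; i++) {
      const xNew = x + stepSize * (Math.random() * 2 - 1); // symmetric proposal
      const logpNew = logDensity(xNew);
      if (Math.log(Math.random()) < logpNew - logp) {
        x = xNew;
        logp = logpNew;
      }
      samples.push(x);
    }
    return samples;
  }

  // e.g. sample a standard normal: metropolis(x => -0.5 * x * x, 0, 10000, 1)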

Relatedly, Bayes' theorem makes much more sense when you visualize it, e.g. this post: https://oscarbonilla.com/2009/05/visualizing-bayes-theorem/

2
warrenmar 6 hours ago 1 reply      
Not to hijack this post, but I'm saddened every time I see anything Monte Carlo related referred to as the Metropolis algorithm. When you're the boss, what you say goes.

http://andrewgelman.com/2014/06/30/invented-metropolis-algor...

3
constantlm 3 hours ago 2 replies      
This is so incredibly far over my head.
27
Left-pad as a service left-pad.io
802 points by manojlds  20 hours ago   233 comments top 50
1
c4n4rd 12 hours ago 4 replies      
This is really exciting!!! I was a bit disappointed that the right-pad will be out only in 2017. I am looking forward to that release because there is a high demand for it now.

What kind of load balancing is being used on the back-end? I called leftpad(str, ch, len) with the length I needed and noticed that it is not very scalable because it is blocking.

A better approach I would recommend to those using it is to call the API in a for loop. In my tests, it had performance very close to what I see in C or assembly.

I was a bit turned off that the free version can only handle strings up to 1024 in length. I know you need to make some money, but it is a big turn-off for a lot of my projects.

Edit: I finally signed up for it but still noticed that I am only allowed to use 1024. I called your customer support line and they said I was calling the API from multiple IP addresses and for that I need an enterprise license. Please help me with this issue; it is crucial at this point, as my project is at a complete stop because of this.

2
pilif 12 hours ago 1 reply      
As a very sarcastic person, I highly approve of this. This absolutely reflects my opinion of all this mess.

Thank you for making this site so that I don't have to write an opinion piece like everybody else seems to have to. Instead, if asked about the issue, I can just point them at this site.

Yes. This isn't constructive, but this mess had so many layers that I really can't point to a single thing and offer a simple fix as a solution.

As such I'm totally up for just having a laugh, especially when it really isn't being nasty against specific people but just laughing about the whole situation.

Thank you to whoever made this

3
faizshah 12 hours ago 7 replies      
I don't understand why this community has to have a weekly cycle of bashing different programming communities. Every week there's a new drama thread bashing Java devs, Go devs, JavaScript devs, etc. The thing that I come to this community for every week is to read about new developments in our industry; if you don't come here for that, then what are you coming here for?

And wasn't it just a few months ago people were praising the innovation of Urbit for having a 'global functional namespace'? But because it's popular to hate on JavaScript devs -- sorry, I forgot this was JavaScript-bashing week -- for reinventing concepts from other areas in computer science and software engineering, the HN community has to start hating on another programming community's work.

That said this is a pretty funny satirical page, apologies to the author for venting at the HN community.

4
pka 6 minutes ago 0 replies      
The real discussion is not about package managers, micromodules, or whatever.

It's about "real programmers can write left_pad by themselves" and everybody else just sucks. True scotsmen etc.

Now I don't know why asm people aren't feeling threatened and aren't attacking the C left_pad gurus yet...

5
mschulze 7 hours ago 4 replies      
As a Java developer, I am a bit jealous. When people joke about us we usually only get a link to the Spring documentation of AbstractSingletonProxyFactoryBean (or maybe the enterprise hello world), but no one ever wrote that as a service. Maybe someone can do that? https://abstractsingletonproxyfactorybean.io seems to be available!
6
supjeff 12 hours ago 3 replies      
7
sentilesdal 2 minutes ago 0 replies      
For the graphic designers out there who need left-pad, blue-steel-pad is now available.
8
nogridbag 7 hours ago 2 replies      
I'm late to the left-pad discussion. I thought it was considered a bad practice to depend on an external repo as part of your build process. At my company we use Artifactory to host our Maven libs. Even if one were removed from Maven Central, our builds would continue to work fine (in theory).
9
gumby 3 hours ago 0 replies      
HELP!! The CEO heard about this new service and now my manager told me we need to upgrade all our packages to this new service ASAP! But there's nothing on stack overflow I can use to change our system! I need to get this pilot done STAT so we can plan the migration and send it out for bid!

HELP!!

10
andy_ppp 12 hours ago 23 replies      
Hahaha - isn't it hysterical how everyone is using npm for small reusable code pieces! Aren't they morons! How stupid of people to trust their package manager to be consistent and correct and return the packages they were expecting.

How stupid of people to reuse small, often-used functions that only do one thing well.

How does everyone taking the piss intend to protect themselves from this in their OS package manager, or PPM or composer or pip?

It's not JavaScript devs' fault that the standard library is so piss poor that you need these short code snippets, and I've definitely included small 5-10 line packages via npm or other package managers rather than roll my own, because it's likely they have bug fixes I haven't considered. I can also use npm to share these snippets between the many projects I'm building.

* No I wasn't affected by this because I review the packages that I want to include, however the level of smugness here is absolutely ridiculous.

11
icefox 11 hours ago 1 reply      
Nice, it is even bug compatible

http://api.left-pad.io/?str=foo&len=7&ch=12

return {"str":"12121212foo"}and not {"str":"1212foo"}

12
huskyr 8 hours ago 1 reply      
Reminds me of Fizzbuzz enterprise edition: https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...
13
jaxondu 12 hours ago 0 replies      
It's 2016! Left-padding without any deep learning algorithm is so lame.
14
Flott 4 hours ago 1 reply      
I'm desperately looking for some feedback from big users.

- Does it scale well?

- Is it pragmatic for long term use scenario?

- Is it Thread safe?

- Does it learn from previous call made to the API?

- Does it have a modern access layer?

- Does it enforce 2nd factor authentication?

- Is it compatible with Docker containers?

- What about multi-region scenarios?

- Any benchmark available showing usage with AWS + cloudflare + Docker + a raspberry pi as LDAP server?

15
jeffreylo 12 hours ago 2 replies      
Doesn't work with unicode characters:

# ~ [8:47:18]$ curl 'https://api.left-pad.io/?str=&len=5&ch=0'
{"str":""}%

16
andrepd 8 hours ago 0 replies      
>`left-pad.io` is 100% REST-compliant as defined by some guy on Hacker News with maximal opinions and minimal evidence.

Wonderful

17
beeboop 10 hours ago 2 replies      
Tomorrow: left-pad.io announces $120 million investment at $1.2 billion valuation

Month from now: left-pad announces purchase of $170 million office building in SV to house their 1200 employees

18
rfrey 6 hours ago 2 replies      
I'm very disappointed in the creators' choice of font for their landing page. Practically unreadable, my eyes burned.
19
stared 6 hours ago 1 reply      
I am waiting for "integer addition as a service" (vide http://jacek.migdal.pl/2015/10/25/integer-addition.html).
20
Jordrok 7 hours ago 0 replies      
Very nice! Any plans for integration with http://shoutcloud.io/ ? I would love to have my strings both left padded AND capitalized, but the APIs are incompatible. :(
21
a_imho 8 hours ago 0 replies      
My gut feeling tells me serious software engineers who look down on JavaScript programmers are feeling justified now. Brogrammers are exposed, hence all the knee-jerk reactions. Indeed, it is pretty funny, but dependency management still remains a hard problem.
22
maremmano 12 hours ago 1 reply      
What about splitting left and pad into two microservices?
23
jug 1 hour ago 0 replies      
If left-pad.io goes down, will it take the rest of the WWW infrastructure with it? I'm missing a Q&A for important and apparently relevant questions like these.
24
yvoschaap2 12 hours ago 0 replies      
While this is a very useful SaaS, I always use the tweet package manager from http://require-from-twitter.github.io/
25
sansjoe 3 hours ago 0 replies      
A programmer is someone who writes code, not someone who installs packages. Do you really need someone else to pad strings for you? Come on.
26
p4bl0 11 hours ago 0 replies      
As a friend said on IRC, it's kind of sad that the website is not made with bootstrap.
27
bflesch 5 hours ago 0 replies      
I get an error

 {"message": "Could not parse request body into json: Unexpected character (\'o\' (code 111)): was expecting comma to separate OBJECT entries\n at [Source: [B@6859f1ef; line: 2, column: 22]"}
when using double quotes. It seems some JSON parsing fails. Not sure if this can be exploited, so I wanted to let you know.

Demo link: https://api.left-pad.io/?str=%22;

28
TickleSteve 12 hours ago 1 reply      
Presumably this is using a docker instance on AWS or the like? </sarcasm>

BTW: Well done... nothing like rubbing it in. :o)

29
chiph 12 hours ago 1 reply      
Needs more Enterprise. Where are the factory factory builders?
30
danexxtone 12 hours ago 1 reply      
Where do I sign up for alpha- or beta-testing for right-pad.io?
31
ChemicalWarfare 9 hours ago 1 reply      
BUG! (I think) Using '#' as a 'ch' value pads the string with spaces:

$ curl 'https://api.left-pad.io/?str=wat&len=10&ch=#'

{"str":" wat"}

Please provide github link to fork/submit pr :)
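Edit: this may actually be on my end - an unencoded '#' starts the URL fragment, which curl never sends to the server, so ch arrives empty and the default space pad kicks in. Percent-encoding it should exercise the intended path:

  $ curl 'https://api.left-pad.io/?str=wat&len=10&ch=%23'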

32
creshal 11 hours ago 1 reply      
I need a SOAP binding for this, because reasons.
33
Mopolo 6 hours ago 0 replies      
That would be fun if a company named Left Pad asked to get this domain like Kik did at the beginning of all this.
34
talideon 9 hours ago 0 replies      
But the question is, is it enterprise-ready? :-)
35
idiocratic 9 hours ago 0 replies      
Are you hiring?
36
dkackman1 2 hours ago 0 replies      
SECURITY NIGHTMARE!!!!!!!!!

Without any sort of nonce, this service is trivially susceptible to a replay attack

37
mrcwinn 4 hours ago 0 replies      
They didn't even think to version their API. This is total crap.
38
nyfresh 7 hours ago 0 replies      
100 times on the board: http://bit.ly/1RzOIK2
39
MoD411 11 hours ago 0 replies      
Boy, that escalated quickly.
40
ritonlajoie 11 hours ago 0 replies      
I'm looking for a left-pad specialized Linux distro. Anyone?
41
cmancini 11 hours ago 1 reply      
This is great, but I'll need an API wrapper package.
42
facepalm 9 hours ago 1 reply      
Cool, but it would be more useful if they had an npm module for accessing the service.
43
jdeisenberg 12 hours ago 0 replies      
Have we, or have we not, officially entered the silly season?
44
yyhhsj0521 11 hours ago 0 replies      
I wonder whether the author uses leftpad on this site
45
venomsnake 12 hours ago 0 replies      
I don't think this is enterprise ready. And I am not sure that they are able to scale their service. Left padding is serious business.
46
markbnj 8 hours ago 0 replies      
This is awesome. Props to you all.
47
justaaron 4 hours ago 0 replies      
This is hilarious and timely.
48
d0m 5 hours ago 1 reply      
I'm ready to get downvoted to hell with this comment but here we go..:

I feel like only non-JavaScript devs are bashing small modules and npm. All the great JavaScript devs I know LOVE that mentality.

Let me offer some reasons why I (as a current Javascript dev having professionally coded in C/C++/Java/Python/PHP/Scheme) think this is great:

- Unlike most other languages, JavaScript doesn't come with a batteries-included standard library. So you're often left on your own to reinvent the wheel. I mean, come on, in Python you do "'hello'.ljust(10)", but AFAIK there isn't such a thing in JavaScript. JavaScript is more like the wild west where you need to reinvent everything. So having well-tested libraries that do one thing extremely well is really beneficial.

- JavaScript, unlike most other languages, has some pretty insane gotchas. I.e. "'0' == 0" is true in JavaScript. Most devs have been burned so badly in so many ways in JavaScript that it's comforting to use a battle-tested library, even for a small feature, rather than reinventing it.

- And anyway, where should we put that function? Most big projects I've worked on have some kind of "helper file" that has 1500 lines, and then at some point different projects start depending on it, so no one likes to touch it, etc. So, yeah, creating a new module takes a bit more time, but remember that it's not about the writing time but more about the maintenance time. I'd much rather have lots of small modules with clear dependencies than a big "let's put everything in there" file.

- I feel arguing about whether something should be in a separate module is similar to arguing about whether something should be in a separate function. For me, it's like hearing "Hey, learn how to code, you don't need functions, just write it when you need it." And hey, I've worked professionally on projects where they had no functions, and it was TERRIBLE. I was trying to refactor some code while adding functions, and people would copy my function inside their 1500-line file. Let me tell you, I left that company really fast.

- It's fair to say that UNIX passed the test of time and that the idea of having lots of small programs is extremely beneficial. It forces common interfaces and great documentation. Similar to how writing tests forces you to create better designs, modularizing your code forces you to think about the bigger picture.

- As far as I'm concerned, I really don't care whether a module is very small or very big, as long as what it does is well defined and tested. For instance, how would you test if a variable is a function? I don't know about you but my first thought wasn't:

  function isFunction(functionToCheck) {
    var getType = {};
    return functionToCheck && getType.toString.call(functionToCheck) === '[object Function]';
  }
Who cares if it's a 4-line module? I don't want to deal with that JavaScript bullshit. Yes, I could copy-paste that into my big helper file, but I'd much rather use the one that the JavaScript community uses and tests.

- Finally, it seems like Node/JavaScript didn't start that way. Not so long ago we had Yahoo's monolithic JavaScript libraries and jQuery. Even the first versions of the most popular Node libraries (such as Express) were first written as monolithic frameworks. But they've been refactored into dozens of small modules with clear functions. And now, other libraries can just import what they need rather than the whole project.

OK, so I told you about the good thing. What about the bad thing?

- Adding dependencies to a project is REALLY HARD TO MAINTAIN. I've had so many bad experiences using Node because of that. I.e. I work on a project; it's tested and works fine. 2 months later I clone and start the project and everything breaks. Oh, X and Y libraries decided to fuck everything up; that other library now depends on a new version of Node, but I can't upgrade Node because another library depends on a previous version of Node. It's complex. I won't go on explaining my solution to this problem, but suffice it to say that it's a problem, and installing random amateur libraries in a professional project can lead to disaster.

- It takes longer to code. I touched on that earlier. It's a tradeoff between writing now and maintaining later. Take a look at Segment's GitHub repo: https://github.com/segmentio. I'd personally love to have that as an onboarding experience rather than some massive project with everything copy/pasted a few times. But yes, it took them more time to create those separate modules.

49
jschuur 10 hours ago 1 reply      
Is it rate limited?
50
shitgoose 6 hours ago 0 replies      
This is fantastic! What is your stack? Are you NoSQL or relational? Redis? What is your test coverage? I am sure you're hiring only trendy developers. I see huge potential in your service; do you accept private investments? I would like to get in now, before Google or YC snatches you! Again, keep up the good work and - can't wait for right-pad next year!
29
What Prominent Roboticists Think Google Should Do with Its Robots ieee.org
73 points by mrfusion  13 hours ago   56 comments top 9
1
sounds 9 hours ago 4 replies      
I think Robin Murphy, Texas A&M Professor, has the right suggestion for Google. No idea if Google will actually do it.

> If I were a large company with deep pockets, I would accept that robotics is what is called a formative market and just like shopping on the web, it will take a decade or so to ramp up. Robots are innovations that do not directly replace people and thus it is hard for people to imagine how to use them - thus the market is formative. The theory of diffusion of innovation indicates that potential end-users, not developers, need to experiment with applications to see what works (and not just physically but human-robot interaction as well).

> However, robots have to be extremely reliable and the software customizable enough to allow the end-users to tinker and adapt the robots, so that the end-user can find the killer app. Essentially, you can crowdsource the task of finding the best, most profitable uses of robots, but only if you have good enough robots that can be reconfigured easily and the software is open enough. I would concentrate on creating generic ground, aerial, and marine robots with customizable software and user interfaces in order to enable their regular employees (and customers) to figure out the best uses.

> In order to make sure the development was pushing towards the most reliable, reconfigurable, and open robots possible, I suggest the developers focus on the emergency response domain. Disasters are the most demanding application and come in so many variations that it is a constant test of technology and users. Imagine the wealth of ideas and feedback from fire rescue teams, the Coast Guard, and the American Red Cross, just to name a few! Focusing on emergency management would also have a positive societal benefit.

2
Animats 6 hours ago 0 replies      
From the article: "What are those guys up to?"

The answer seems to be "not very much". Google bought all those robotics companies but didn't get them to work together. All the companies are still in their original locations. Nobody has a product. Even Bot and Dolly, which had a product, no longer seems to be selling it. Boston Dynamics is being sold because they don't play well with others.

The whole robotics exercise seems to have been a hobby of Andy Rubin, and when he left, nobody had a clue what to do. Google/Alphabet, through mismanagement, may have added negative value to the robotics industry. Google's secrecy here seems to be more about hiding management failure than protecting intellectual property.

Martin Buehler, the brains behind BigDog, is now at Disney, and I expect we'll see more mobile robots there. Disney has wanted this for years; around 2000, they hired Danny Hillis to work on theme park robots. They got some improvements to their animatronics, and a dino robot that pulls a cart but gets its balance from the cart wheels.

The next killer app in robotics is probably really good bin-picking. Kiva's mobile platforms can bring the shelf to the picker, but a human still takes the thing out of one bin and puts it into another. Amazon is working on solutions to that.

3
simonh 9 hours ago 4 replies      
Hardly any of the problems they suggest robots can solve are problems Google faces though. Last 10 metres delivery, supply chain control and management, disaster relief work, oil and gas, home assistants - particularly for seniors. There are too many comments in there to address all of them but here's a few:

"If anyone could crack the indoor social robot market that is seeing such high interest right now in both the consumer and commercial spaces, it would be them"

Why on earth would Google care about the indoor social robot market?

"If I were a large company with deep pockets, ... it will take a decade or so to ramp up"

It's always easy to say how other people should spend their money, and just because a company is one of the few that can do a thing, it doesn't follow that they should do it. A lot of the speculation seems to be of the "if I had a billion dollars, I would..." wish fulfilment sort.

I can see the point of self-driving cars. They were doing Street View anyway, and that's a pretty obvious application for autonomous vehicles. But they're not actually a manufacturing or supply chain company. Their foray into that with Motorola was an expensive mistake. I just don't see that a lot of the other robotics stuff they're doing is relevant to their business. That doesn't mean nobody should be investing in this. Sure, there are applications down the line, but are they relevant to Google?

The only one I can think of is data centre build-out and maintenance. But Google aren't near the scale needed to support an entire industrial robot development and manufacturing industry just for a handful of data centres.

4
resoluteteeth 11 hours ago 0 replies      
> 17 Prominent Roboticists Think Google Should Do with Its Robots

Apparently this includes having robots write titles for HN posts?

5
monk_e_boy 11 hours ago 2 replies      
> For autonomous deliveries, it's the last 10 meters that is the hard part.

Couldn't someone build a delivery box (like an American mailbox) that could be mounted on your property, close to the curb, within reach of a delivery truck?

The truck could stop, tell the 'amazon delivery box' to unlock and open, then place the items in it. There could be boxes of various sizes, from small ones up to large chest-freezer-type things. The box would have some electronics in it and a solar panel on top; it doesn't matter how expensive they are - you can rent them from Amazon or whoever.

This would then reduce all the complex variables of how to drop packages off at the destination.
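To make that handshake concrete, here is a minimal sketch of how the truck-to-box unlock could work, assuming the box and the delivery service share a per-box secret and the truck presents a short-lived signed token. Everything here (the names, the token format, the 15-minute expiry) is an illustrative assumption, not any real Amazon API:

    import hashlib
    import hmac
    import time

    # Per-box secret, provisioned when the box is installed (illustrative only).
    SHARED_SECRET = b"provisioned-at-install-time"

    def issue_token(box_id, delivery_id, now=None):
        """Delivery service side: sign (box, delivery, expiry) and hand the token to the truck."""
        expires = int((now if now is not None else time.time()) + 15 * 60)  # 15-minute validity
        payload = "%s|%s|%d" % (box_id, delivery_id, expires)
        sig = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
        return payload + "|" + sig

    def box_should_unlock(box_id, token, now=None):
        """Box side: verify the signature, the expiry, and that the token targets this box."""
        payload, _, sig = token.rpartition("|")
        if not payload:
            return False
        expected = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return False
        parts = payload.split("|")
        if len(parts) != 3:
            return False
        tok_box, _delivery_id, expires = parts
        return tok_box == box_id and (now if now is not None else time.time()) < int(expires)

    token = issue_token("box-42", "delivery-20160325-001")
    print(box_should_unlock("box-42", token))  # True: unlock, open, drop the package in
    print(box_should_unlock("box-99", token))  # False: token was issued for a different box

The nice property of a scheme like this is that the box never needs a network connection at delivery time: the truck carries the token and the box can verify it offline.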

6
reacweb 10 hours ago 1 reply      
Google is doing research in AI to reach the singularity. I think robotics is very important for Google, but reassuring people is more important. Google does not want to be associated with something frightening like "terminator".
7
Isamu 10 hours ago 0 replies      
What, nobody wants tap-dancing? Vaudeville anyone?

Geez, that was the FIRST thing Walt Disney went for. Look up "Project Little Man" with Buddy Ebsen as the model.

http://www.waltdisney.org/blog/early-days-audio-animatronics...

8
bliti 10 hours ago 0 replies      
I think Google should continue with its self-driving effort. They have their maps and fiber technology, which provide some hard-to-get pieces of the puzzle. This is the robot that will impact humanity the most: an automated box on wheels (seats optional). Just think of the things a self-driving car could do for you:

- Pick up anything from mostly anywhere.

- Schedule pickup around the clock.

- Move things/people securely and be able to track the contents through a camera.

- Act as a mobile living space. Not requiring a dashboard frees up a lot of space.

9
ikeboy 12 hours ago 5 replies      
Title is borked. Should have a "What" at the beginning.
30
The mobile games industry is kept afloat by less than 1% of users thenextweb.com
122 points by cpeterso  7 hours ago   165 comments top 28
1
rrowland 6 hours ago 11 replies      
This article and the comments here saying "This is common knowledge in the industry" and "I'm getting tired of people saying this is a bad thing" are a beautiful illustration of why mobile games suck and freemium is destroying the industry.

There are plenty of good games out there. The problem is nobody plays them. Then the devs of those games say "Fuck this, I'm out" and go on to make pay-to-win games, because that's how you get paid making games. It's gotten to the point where the games we play aren't even fun; we just find an easy game that lets us shoot up a hit of dopamine once in a while and settle for that. Or maybe we pick up Clash of Clans, play "free" for a month, and then, once we're hooked, pay more every month than we'd ever pay for a real game, just to be competitive. And we still lose, because someone else has more money.

The majority of popular mobile games are a costly addiction, not a hobby.

2
iaw 6 hours ago 6 replies      
This is pretty common knowledge within the industry. In fact, more than 4 years ago, they were already calling these users "Whales" (as a fishing metaphor) and actively catering to their interests.

There are people who will spend well over $10,000 a year to fulfill their compulsion for their "chosen" game. The advice in the article is actually rather asinine, unfortunately:

>"Game creators should begin to look at pushing more sales after install. Swrve suggests this should be done one month after the game has been downloaded. Another way to keep keen players coming back would be to reduce the privileges given in purchases so they will need to buy more to play more."

It's implying that game creators haven't been carefully playing the analytics game since before freemium was a word. Any of these proposed changes will typically have catastrophic effects on the casual user base, which could reduce the game's popularity and then tank the whale usage (depending on how much the game depended on network effects). Freemium games face the same problem TV networks do: there's too much alternative content out there.

>"Although, perhaps restructuring or ditching the freemium model might be the safest bet."

This is the right answer if you want to make good games. Freemium is an excellent model for optimizing profit, but it is typically directly counter to what can be considered a "good" gaming experience. The freemium games that have been largely successful (Clash of Clans, Hearthstone, Candy Crush, etc.) all succeeded because they were easy to play for free; this garnered more popularity, which brought in more "whales" to support their creators.

I see mobile gaming as a race to the bottom: the margins are razor thin for the companies, and the pay and growth opportunities are mediocre for the employees. Occasionally King comes along and wins it for a year, but there are hundreds of these companies out there and the odds aren't good for them.

3
newobj 6 hours ago 1 reply      
Let's pretend video games on console/PC are music albums. Let's pretend board games are live music. Let's pretend mobile games are ringtones.

They're all "music", but they're all very different products.

Ringtones are cheap, disposable, mostly looked down upon, but still popular at the same time somehow.

Live music is seen as requiring a greater level of commitment; perhaps a little more exerting or rough around the edges, strangely pricey, but generally worth the above-and-beyond effort you have to make.

Music albums are ubiquitous, real effort went into many of them and they are not really disposable. There are a lot of sad "me too" attempts to capitalize on other trends created by first-movers. Most people have at least a couple they are into, and connoisseurs can dive into a pile of esoterica to unearth unappreciated artistic gems. Every once in a while, a major production turns out to actually be artistically important too, and people freak out.

What's my point? The "game" in "mobile game" seems pretty weighty, implying a close kinship with other kinds of games, but really they're as distant in kinship as ringtones are from "music". They're another product altogether. And ephemeral. And dangerous to tie your longevity to as a content producer.

[edit: typos]

4
CM30 5 hours ago 1 reply      
This is what happens when the race to the bottom goes too far. Developers realise this sort of tactic makes more money than traditional sales, players start expecting all mobile games to be free with microtransactions, and eventually you have an industry that is pretty much unsustainable in the long run.

Still, it's not as bad as it could be, at least not in the US or Europe. If you think it's bad that people are encouraged to pay money for things in mobile games here, well, the 'kompu gacha' type stuff is another level of scary:

https://www.youtube.com/watch?v=UOWFvlBPnk4

It's basically literal gambling in some regions. You don't pay for items or even in-game advantages; you pay for the 'possibility' of getting characters and items. It's between 1 and 5 dollars for a roll of the dice, and the probability of getting a rare character can be less than 0.1%. There are stories of people spending thousands of dollars in a few hours playing these games in a livestream on Twitch or the like...
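To put rough numbers on that, here's a tiny back-of-the-envelope sketch in Python. The 0.1% rate and the $1-$5 per-roll prices come from the paragraph above; modelling the rolls as independent draws (a geometric distribution, ignoring pity timers and other real gacha mechanics) is my simplification:

    # Expected rolls to the first rare character, assuming independent rolls.
    p = 0.001               # 0.1% chance of the rare character per roll
    expected_rolls = 1 / p  # geometric distribution mean: 1000 rolls

    for price in (1, 5):    # the $1-$5 per-roll range mentioned above
        print(f"expected cost at ${price}/roll: ${expected_rolls * price:,.0f}")

    # Bad luck is routine: after 1000 rolls you still have nothing ~37% of the time.
    print(f"P(no rare after 1000 rolls) = {(1 - p) ** 1000:.1%}")

Under those assumptions the expected spend to land one specific rare character is $1,000 to $5,000, which lines up with the stories of people dropping thousands of dollars in a sitting.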

Fortunately, this isn't as common in most games on the US app store just yet.

5
zongitsrinzler 6 hours ago 1 reply      
Personally I feel that both Play and App stores are partially to blame for this.

The stores really reward clickbait games and make it nearly impossible to discover good/serious games.

6
akjetma 6 hours ago 0 replies      
> "... the report looked at over 40 free-to-play games through February 2016, analyzing the uses of more than 20 million players."

If they're looking at 40 games with 20 million players during one month, they're looking at a specific class of mobile games, not mobile games at large.

Also,

> "A new report is highlighting that risk, showing that almost half of all the revenue generated in mobile gaming comes from just 0.19 percent of users. That means the other 99.81 percent of users arent worth anything money-wise to the creators."

Uh, aren't they worth 'half'?

edit: anecdotally, I've probably spent about $100 in the past year on random puzzle games that probably don't end up on the top of lists. Not sure how weird/errant I am though.

7
sharkjacobs 28 minutes ago 0 replies      
There's a semi-famous aphorism about advertising:

If you're not paying for it, you're the product.

People who pay for consumable in-game resources aren't interested in the same kind of games that I'm interested in. Reading this article made me realize:

If you're not paying for it, it's not designed for you.

8
ksk 5 hours ago 1 reply      
People defending the freemium model are detestable, IMHO. I don't agree with the notion that companies are justified in doing "anything" if it is going to make them money. Unethical practices should be publicly shamed as much, and as often, as possible.
9
_ph_ 2 hours ago 0 replies      
The big problem with the freemium model is that it seems to push developers to make bad games. The model is based on making people repeatedly spend money on the game - and that is too often done by blocking the game until more money is spent, which of course makes for a bad gaming experience. With a fully paid game, the incentive is to entertain users to the point that they are happy with the purchase and spread the word.

Around the time the mobile world became dominated by freemium games, I bought myself a Nintendo 3DS, and I have not regretted it. The games are much more expensive up front, but they offer very elaborate gameplay and long-term fun.

10
onion2k 3 hours ago 0 replies      
If I download a game made by a company that's running at a loss, then it's their backers/founders/previous successes who have subsidised my gaming, not the 1% of people who buy things in Candy Crush. Candy Crush players have nothing to do with what I'm playing. Consequently, the premise that the 1% of players who buy in-app things keep the industry afloat is wrong - the industry is kept afloat by the money people risk investing in or producing games in the hope of becoming the next Candy Crush.
11
waterlesscloud 6 hours ago 0 replies      
I wonder if the same applies to say... Google. That they're kept afloat by less than 1% of their users.
12
jonmc12 3 hours ago 0 replies      
Tapjoy posted an infographic recently as well: http://www.adweek.com/socialtimes/infographic-whales-account...
13
increment_i 2 hours ago 0 replies      
It's truly amazing - I can't think of another field with such advanced, accessible tools yet such a shitty market to enter as the games industry.
14
devit 6 hours ago 0 replies      
Note that this is mostly caused by the fact that mobile game developers are courting such a distribution by making "pay-to-win" games where you can pay more to get advantages without limits.

While such a structure means that a game developer can earn $10k or more from a single player, it also makes the game worse for those who aren't willing to spend unlimited amounts of money, and thus drives some of them away or causes them to decide not to spend any money at all.

If this is not desired, the simple solution is to have an "unlimited pass" that gets you access to all current and future IAP/DLC content that gives a game advantage for either a fixed one-time fee or a fixed subscription fee.

This is probably not often done because the game developers believe that it would be less profitable overall.

15
zf00002 5 hours ago 0 replies      
I'm wondering when we might see regulation, at least in the US, of the gambling-like mechanics some games have (not just mobile ones). What I mean is that some games have you build a team of characters. They'll have an in-game shop where you buy "card packs", with a chance of getting legendary (or whatever) versions of cards. Yet nowhere are the odds of getting those cards listed, and even if they were, there's no regulatory body ensuring that those odds are correct.

From what I understand, Vegas voluntarily has rules in place that test this to keep from being regulated?

16
protonfish 6 hours ago 3 replies      
I am getting tired of people complaining that this is a bad thing. Freemium monetization works: developers get paid and consumers get a lot of great content for free.

I disagree with the article's recommendation to basically put the screws to the customers until they pay up:

> Another way to keep keen players coming back would be to reduce the privileges given in purchases so they will need to buy more to play more.

Keeping happy users that recommend your game/app is the driving force of freemium. Pissing them off just kills the golden goose. All you need is to make certain that if a customer WANTS you to take their money, there is always something to purchase at high and low price points.

17
tdkl 4 hours ago 0 replies      
I'd ask another question: in the mobile world of instant gratification, with social networks, image crafting, and everything reachable at a click, are mobile games even "fun" enough to compete with all that? If I get more gratification from other instant things, why even bother with a game? Hell, why pay for it?
18
jessaustin 5 hours ago 1 reply      
I haven't worked in this space, and I don't play this type of game, but ads for these freemium mobile games seem to show up everywhere. Are the ones with the giant ad spend the only ones making money? Are they not making money either? At first it seems that more discoverable app stores or even some other "curation" services could help lower the required ad level, but then it occurs to me that perhaps whales really only respond to flashy expensive ads on popular media properties?
19
fpgaminer 5 hours ago 1 reply      
The game industry was born in arcades. You know, those arcades filled with people slumped over bright, flashing machines, pulling levers and plopping in quarter after quarter. No, no, not the casino. The arcade! The one where all the games were specifically designed to make the player lose, forcing them to insert more quarters to keep playing and get another chance at winning. Yes, I'm sure I'm talking about an arcade, and not a casino...

Then came consoles, and everyone pretty much ditched arcades and stayed at home. We went from spending buckets of quarters drawn out over the course of an evening to spending buckets of quarters all at once every month or two, and taking out a loan to afford the next $400+ console. The arcade model died because consoles were better in every regard. People wanted to stay home, the games were better and more engaging, the platform and context allowed for games with more depth and value than could ever be achieved in an arcade, and you didn't have to worry about the console eating your quarters. And no more collecting 10 million tickets only to realize that the best item you can redeem them for is a plastic Transformers ring.

Then came PCs, and PCs and consoles lived happily alongside one another for the next decade or two. So how does the story end? Well, arcades are back, baby, but in the form of pay-to-win freemium mobile games. They've evolved; now you don't have to go to some sticky-floored dungeon to play Time Cop 15. You can do that from the comfort of your own phone. Just plop in $1 every now and then to keep playing. Best of all, game devs found a way to price discriminate. Poor? Here's some ads and patience. Rich? Push a button and Apple will take care of the rest.

But the games haven't changed. Mobile games are just as shallow as arcade games ever were. But weren't arcades great? Of course they were; those games were a blast. They're a different kind of fun. A kind of fun you can pick up, enjoy, and then leave without another thought. These aren't grand masterpieces like The Witcher 3, or thought bending games like The Stanley Parable. But they were still fun, exciting, and a great way to relax or kill time.

So, my question is, why is everyone so caustic towards mobile gaming? Because it's exploitative? Sure, I can understand that, but so were arcades. If I had to guess, it's perhaps the pervasiveness of the games and the manner in which they present themselves that people find so offensive. See, arcade games were at the arcade. You had to drive there and physically be there for a while. Mobile games are constantly with you, just one finger swipe away. And they take advantage of that, with notifications and constant come-hither looks to get you to play and spend. Also, arcade machines were upfront about their pricing model. The quarter slot is right there on the front of the machine, and the screen is flashing "Insert Coin". Mobile games advertise themselves as free, and even let you play a little bit before revealing their true payment model. That's deceptive in ways arcades were not.

No real point other than to remind everyone of history. It's clear that mobile gaming shares a heritage with arcades, and that the model has evolved, but perhaps that evolution is not what everyone was hoping for. I'm more curious what the next "console" evolution will be.

20
vlunkr 6 hours ago 1 reply      
> is so reliant on a few hardcore users for revenue

I might replace 'hardcore users' with 'children who don't know what real games are'

21
dave2000 6 hours ago 2 replies      
"Although, perhaps restructuring or ditching the freemium model might be the safest bet"

Sadly there wasn't room in the article to elaborate on what form this might take.

22
55555 6 hours ago 0 replies      
I would love to see similar statistics for gambling and taxation.
23
rocky1138 4 hours ago 1 reply      
There is a potential market for an app store that refuses to offer any free-to-play titles.
24
DrNuke 6 hours ago 0 replies      
The most successful game I made was my first and simplest, but that was Nov 2011. By mid-2014 I was outmarketed as an indie dev.
25
someguyfromwi 6 hours ago 1 reply      
Am I the only one who sees that this is eerily similar to the US federal tax model, the 1%, etc?
26
Zikes 6 hours ago 4 replies      
0.19% of mobile users have "more money than sense".

99.81% of users don't see the point in buying the $5 "valu-pack" of [insert contrived currency here] to unlock an extra few hours of playtime every day.

Mobile developers get angry and post rants about people being averse to spending as much as a cup of coffee on their abusively designed time sinks.

And other gaming industries are starting to follow suit. There are AAA console/PC titles with microtransactions now; it's ridiculous.

27
shkkmo 4 hours ago 1 reply      
Why is this article getting upvoted on HN? It is poorly written, contains very little information, and completely mangles its interpretation of the statistics.
28
Shivetya 6 hours ago 0 replies      
Aren't the freemium games on other platforms more of the same? I play a few, and there never seems to be a game without those few players who buy up every perceived advantage they can. If not advantages, then every special item or unit.

I remember Mechwarrior Online coming out with GOLD-colored mechs for some obscene price and seeing them in game. I want to say they were like five hundred bucks. Wargaming also follows a similar model, but I haven't seen individual tanks or ships cresting a hundred bucks; there are enough in the fifty-dollar range, though, and they are plentiful in matches.
