hacker news with inline top comments    8 May 2016 Best
SpaceX lands rocket at sea second time after satellite launch phys.org
804 points by dnetesn  2 days ago   464 comments top 26
kirrent 2 days ago 3 replies      
For those wondering, this isn't a simple repeat of CRS-8, which landed on a drone ship about a month ago. CRS-8 was a mission to low Earth orbit, which left the first stage with plenty of fuel to effect a landing. The landing was made as slow as possible and limited only by how low a single engine could be throttled. The re-entry burn, which slows the rocket down before the landing burn, was also more aggressive.

This mission launched JCSAT-14 toward geosynchronous orbit, which required the Falcon 9 to move the already heavy satellite into a geosynchronous transfer orbit. That meant the first stage had to do a lot more work and was left with a lot less fuel as a result. Therefore, the re-entry burn was less aggressive, the first stage came in at twice the speed, and SpaceX needed to pull off a far harder landing with three engines lit in the quicker and riskier suicide burn. Somehow, despite playing down expectations, they managed an even more precise landing than last time.

geerlingguy 2 days ago 2 replies      
What's notable about this attempt (as opposed to the last) is that the first stage rocket was traveling twice as fast (4x the energy to overcome on landing), and didn't have enough propellant to do a 'boostback' burn.

Instead, coming in very hot, the rocket had to use three engines (instead of one) to slow for landing. The last time this was attempted, the first stage put a nice hole in the deck of the drone ship.

Seeing the rocket dead-center on the barge was quite a sight!
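The "4x the energy" figure above follows from kinetic energy growing with the square of speed; a quick illustrative check (the stage mass and speeds below are rough assumptions for the sake of the arithmetic, not SpaceX figures):

```python
# Kinetic energy scales with the square of speed, so a stage coming in
# twice as fast carries four times the energy the landing burns must shed.
def kinetic_energy(mass_kg, speed_ms):
    return 0.5 * mass_kg * speed_ms ** 2

stage_mass = 25_000   # kg, rough empty first-stage mass (assumption)
v_slow = 500.0        # m/s, illustrative re-entry speed for a LEO mission
v_fast = 2 * v_slow   # the GTO mission came in at roughly twice the speed

ratio = kinetic_energy(stage_mass, v_fast) / kinetic_energy(stage_mass, v_slow)
print(ratio)  # 4.0
```

Note the mass cancels in the ratio, which is why "twice the speed, four times the energy" holds regardless of how heavy the stage actually is.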

manaskarekar 2 days ago 2 replies      
Hosted webcast: https://youtu.be/L0bMeDj76ig

Landing at 38:18mins https://youtu.be/L0bMeDj76ig#t=38m18s

Great info on the thrusters on the first stage at 19:37. https://youtu.be/L0bMeDj76ig#t=19m37s

Edit: Thank you and sorry about the time stamp link, I posted quickly from my phone. Hopefully it works on shortlinks.

js2 2 days ago 0 replies      
Webcast - https://youtu.be/L0bMeDj76ig

First stage landing is just after 38 minutes - https://youtu.be/L0bMeDj76ig?t=38m

EA 2 days ago 2 replies      
Doing it one time is a technological leap.

Doing it twice in such a small window of time is a logistical/programmatic leap.

mulmen 2 days ago 4 replies      
Looks like the landing was perfectly centered on the barge this time as well. Does anyone know when they plan on reusing one of these recovered first stages?
ino 2 days ago 2 replies      
How are non-US space agencies and companies (like ESA, Russia, Japan, China, etc) reacting to the SpaceX and Blue Origin achievements?

Are they also testing similar things but we aren't hearing? Or they aren't threatened by the advances?

iamcreasy 2 days ago 1 reply      
A white ball of fire. And then there it is...resting! Beautiful!
zupreme 2 days ago 10 replies      
I remain confused about why SpaceX is getting so much fanfare and praise. What have they accomplished that NASA didn't already accomplish during the Apollo missions?

I know SpaceX is doing it all more cost effectively, because we have better technology, but have they actually accomplished anything tangible that NASA didn't a generation ago?

grondilu 1 day ago 1 reply      
This success was unexpected. That means that they can recover a stage from a higher speed than previously thought. Does that mean that the recovery of the second stage from LEO may actually be feasible?
tdrd 1 day ago 0 replies      
Here's the landing from the webcast video on youtube https://youtu.be/L0bMeDj76ig?t=2300
smegel 2 days ago 1 reply      
This is one of those turning points in human history.
Gravityloss 2 days ago 0 replies      
It could be argued that flying to orbit with a first stage that is being reused will be just as historic, when SpaceX gets there.

The Space Shuttle did reuse its SRB casings and the orbiter; it was a bit different, but a great achievement too.

However this time there is more potential for cost savings. It can still happen that they can't be realized because of some details or even fundamentals we don't understand from the outside.

Kinnard 2 days ago 1 reply      
I wonder if there's potential for coordination with seasteading? Astronauts are gonna want all sorts of goods and services upon landing, I imagine. Having someone stationed nearby will mean swifter pickup of equipment and astronauts. And if there's someone stationed nearby, they'll want goods and services as well.
leecarraher 2 days ago 10 replies      
Am I missing something: is the need to land vertically a requirement of a fragile fuselage? It seems wasteful to carry extra anything (in this case rocket propellant) to space just to avoid having to re-right the rocket once you get it back on Earth for a subsequent launch. Is the extra fuel payload really less than landing gear or parachutes?
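One way to frame that trade-off is the Tsiolkovsky rocket equation: the propellant needed for recovery burns grows exponentially with the delta-v to cancel, but for a nearly empty stage the absolute mass can be modest. A back-of-the-envelope sketch (the engine Isp, stage mass, and delta-v budget below are rough assumptions, not SpaceX numbers):

```python
import math

def propellant_mass(dry_mass_kg, delta_v_ms, isp_s, g0=9.81):
    # Tsiolkovsky: delta_v = isp * g0 * ln(m0 / m_dry)
    # => m_prop = m_dry * (exp(delta_v / (isp * g0)) - 1)
    return dry_mass_kg * (math.exp(delta_v_ms / (isp_s * g0)) - 1)

dry_mass = 25_000   # kg, rough empty first-stage mass (assumption)
landing_dv = 900    # m/s budget for re-entry + landing burns (illustrative)
isp = 282           # s, ballpark sea-level Isp for a kerolox engine (assumption)

m_prop = propellant_mass(dry_mass, landing_dv, isp)
print(round(m_prop))  # roughly 9600 kg reserved for recovery
```

Whether a reserve on that order beats the mass, drag, and refurbishment cost of a parachute-and-splashdown system is exactly the trade being debated here; for what it's worth, SpaceX reportedly tried parachute recovery on early flights before settling on propulsive landing.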
brianwawok 2 days ago 5 replies      
Here is the video


Hopefully we won't have the typical hacker news discussion about why the employees chant USA.

ck2 2 days ago 1 reply      
The long-term plan is to put humans on these right?

How many years out is that?

Ferver777 1 day ago 1 reply      
Simply incredible. Elon Musk is the New Thomas Edison
hcrisp 2 days ago 0 replies      
"One if by land, two if by sea..."
samstave 1 day ago 1 reply      
Can someone please explain to me how the Falcon orients itself during this whole process?

Does it use fins, or simply engines and whatever those little jets are that, say, the space shuttle had around its nose...

How does the falcon manage to physically orient its body?

Also, is that process completely autonomous? Is there a remote flight engineer steering it to the barge, or is it completely self-guided?

hoorayimhelping 2 days ago 4 replies      
>P.S. Downvotes for this question? Really?

There are always people on HN who love being contrarian and trying to diminish amazing things. It's like they hate seeing people excited for something and want to point out that this isn't that big of a deal.

We're literally watching things happen for the first time ever, and this person's response is "what's so great about this?" while implying that people who are excited about this are fanboys. I can understand the downvotes honestly.

neals 2 days ago 14 replies      
I always try to watch the webcasts and I can't help but cringe so hard when they start chanting 'USA USA USA', every single time.

What is up with that?

jklinger410 2 days ago 4 replies      
I find the level of criticism for SpaceX on this site to be incredible.

I understand this is a place for intelligent discussion, but sometimes I feel like I can't find a single thread on this site without someone in the comments presuming they are smarter than someone else.

thrownear 2 days ago 1 reply      
Ok. Can we have less of Tesla/SpaceX posts. There is already one discussion about this in the front page...
SagelyGuru 2 days ago 4 replies      
Good work! I still think that they should have some grippers on the barge, or a net, ready to spring and grasp the rocket on landing, instead of carrying flimsy legs all the way to space and back.
gist 2 days ago 2 replies      
The big difference between landing at sea and on land seems to be the PR value. Otherwise a target is a target, taking any differences in weather into consideration. In fact, landing at sea actually seems like less pressure, since you don't have the same safety concerns, and instead of crash pictures (which are bad for publicity) the rocket simply disappears.
Apple Stole My Music vellumatlanta.com
1238 points by panic  3 days ago   442 comments top 83
funkyy 3 days ago 6 replies      
I am surprised at people trying to rationalize Apple's wrongdoing by pointing at Google and bringing up some crazy examples.

There is no rationale for this: it is an outright violation of your privacy and ownership rights. No terms and conditions can be above the law. It doesn't matter what others do. Apple keeps doing these crazy things here and there, trying to test the ground, which indicates they are not pro-privacy and pro-user, but rather are willing to go to huge lengths to please the music industry.

Just because you eat in a restaurant doesn't mean the waiter can run to your house and smash all the food in your fridge, reasoning "from now on you are covered".

rossng 3 days ago 15 replies      
Every time I am asked to set up someone's Apple device, I find it incredibly difficult to:

* Get it synced properly

* Determine what is stored in iCloud/on-device

* Ensure that device contents are actually backed-up, unless done manually

* Set up simple things like email accounts

Just a month or two ago, I was helping someone whose iTunes music wouldn't sync to their iPhone. It turned out that, when they signed up to Apple Music, iTunes had silently flipped on a setting that prevents this. Working out what on earth was happening took me almost an hour.

Yet everyone tells me that their Apple devices 'just work'. I don't have the same experience - I find their behaviour to be utterly opaque and non-deterministic. Am I alone?

6stringmerc 3 days ago 1 reply      
At first I thought the tone of the article was a bit hyperbolic, but upon reading further, no, this totally fits with the emotions that I'd probably feel upon such a situation. Shock, then horror, then anger. Then solution minded...then when hitting a wall...

>When giving the above warning, however, even in my most Orwellian paranoia I never could have dreamed that the content holders, like Apple, would also reach into your computer and take away what you already owned.

It just feels dirty, and, as my Software Developer Uncle probably would've called it, "Playing outside the sandbox." I mean, sure, as the article notes, the TOS gives Apple a lot of consent, but "Loss or Damage" via incidental use vs. outright deletion via intentional coding feels...different. Maybe legally they're not...

I do remember ranting at the top of my lungs after an online jam software installed an update and crashed my Win7 PC laptop so hard it had to rebuild via a command prompt screen. By then, I already had CD backups, a USB HD 500GB full of projects, and it was a cold reminder. The laptop restored fine, but whoa, not fun. Not what I signed up for in the agreement, risk-wise, I felt, so I've essentially stopped using that software.

...and I'll close by reminding myself I'm perfectly reasonable with my Win laptop setup, not running iTunes (Winamp), and backing up to a local cloud or other media (another USB HD coming soon). Life happens, accidents happen...but there's some funky software out there.

Heck of a story, and one I will point to gladly when discussing paths for audio DAW hardware/software platforms.

Terretta 3 days ago 5 replies      
I'm an Apple iTunes Match (and Apple Music) subscriber with 23,000 hand-ripped songs; only about 18,000 could be exactly matched by Apple, and 5,000 are less common versions.

I also have iCloud Music Library enabled. Note these are three different services, and iCloud Music Library has been the most likely culprit for monkeying with your music, not Apple Music.

The combo of all three has deleted or auto-replaced exactly zero of my custom rips. I can use a series of steps (smart playlist to find matched songs, then manually delete that set) to shift to using Apple's high quality unprotected version, or not.

I wonder if the key to everything working as you imagine is the Match subscription:


If using Match, then after matching, manually deleting local, then re-downloading, you end up with a very high quality file without DRM that will continue to work fine and be portable, even after you cancel.

I'm not sure what happens if using only Apple Music without Match.

The linked article sounds a lot like the iCloud Music Library beta problems in July 2015:


And similar support experiences:


onion2k 3 days ago 6 replies      
I can't think of a single reason why Apple would want to delete the files from the user's computer apart from an intent to lock the user in to the service by making it tremendously hard to leave. That's the black hole of UI dark patterns.
buro9 3 days ago 10 replies      
It's worth noting that Google Photos actually does something similar... the photos are "backed up" even in their "original" form... but I don't believe it's possible to actually restore (download all) from Google Photos.

I thankfully have a local backup of the photos I took, but when a phrase like "backup" is used, it is implicit and understood that there is a "restore" mechanism.

Google Photos lacks a "restore" mechanism, and it sounds like the same is true of Apple Music.

Google Play Music also does the matching/mismatching thing.

An uploaded Ladytron - Gravity the Seducer was replaced by a remix album but remains tagged as if it's the original. This is probably due to them not having the original, and this was a 90% match based on tags... but 90% is not good enough. I worked in the music industry and have so many demo tapes, master cuts that were not subject to post-production, etc. I want the version I have, and not some approximate guess at something similar-ish.

This isn't just an Apple issue.

hudo 3 days ago 3 replies      
Something similar happened to me years ago: I got a new iPhone, installed iTunes, and it synced all my local files. A few weeks later, I de-synced the phone or something like that, and all my local files were gone! I'm still not sure how or why; there were no warnings or anything. Since that day, I stay as far as possible from iTunes and similar "smart music sync" apps. Spotify doesn't see my local music collection, and vice versa. All music streaming services and their apps are total crap; they try to lock you in and manage everything like you're an idiot.
drbawb 3 days ago 0 replies      
iTunes is the driving reason behind why I got rid of my iDevices.

First of all using iTunes for Windows has a way of making its users feel like something of an afterthought.

Secondly I was always scared to sync my devices. Is this the day it tries to undo my jailbreak? Is this the day that deleting a playlist actually removes all the songs from my device? Is this the day that it removes an app I depend on because Apple decided to kick it out of the app store?

Somewhat relatedly I never felt comfortable plugging my iPhone into my secondary computer. The whole process of authorization and syncing is just horribly opaque.

It's some of the most opaque software I've ever had the displeasure of using. How they managed to fuck up a file copy[1] so badly is beyond me.

Android isn't without its own faults, but at least it feels like its own independent device, with its own copy of my library. I don't want a complicated sync mechanism: I want to put this new album onto my phone.

[1]: https://www.foobar2000.org/components/view/foo_dop

Animats 2 days ago 0 replies      
It might be worth filing a criminal complaint with the FBI, under the "exceeds authorized access" provision of the Computer Fraud and Abuse Act.[1] Apple's EULA [2] does not give them unlimited access to your computer. It just keeps you from suing them in civil court over it. You probably need a lawyer and a press agent. Someone really needs to take this to court.

See the Justice Department's CFAA guide [1], under "Intentionally Damaging by Knowing Transmission". Also read the section on "Exceeds authorized access", starting at page 11.

This guy was told that the software was operating as intended. That shows criminal intent. It eliminates the defense in the CFAA under "No action may be brought under this subsection for the negligent design or manufacture of computer hardware, computer software, or firmware." The CFAA has a civil suit provision: "Any person who suffers damage or loss by reason of a violation of this section may maintain a civil action against the violator to obtain compensatory damages and injunctive relief or other equitable relief." That may override the EULA, but this needs legal advice.

[1] https://www.justice.gov/sites/default/files/criminal-ccips/l...
[2] http://www.apple.com/legal/internet-services/itunes/appstore...

musesum 3 days ago 1 reply      
> "Amber relayed to me that she's had to suffer through many calls from people who cancelled their Apple Music subscription after the free, three-month trial, only to discover that all of their own music files had been deleted and there was no way to get them back."

The Apple Music three-month trial had a positive effect on my listening habits. I tried it, hoping to discover new music. But the UX was a slog. No easy pivot points on artists and songs. A DJ workflow vs. a listener workflow, which added complexity. No collaborative filtering.

So, I switched to Spotify. The "Discover Weekly" section that uses Echonest (I think) has surfaced new artists and songs to explore. Pivot points on songs, artists, and playlists are straightforward.

I have spent hundreds of hours ripping and curating high hundreds of CDs over the last 25 years. I barely touch them. At this point, if you asked me to choose between my old catalog and Spotify, I would choose the latter. Heh, thanks to the Apple Music three-month trial.

andrey_utkin 3 days ago 1 reply      
Enjoy your cloud service.

"FUCK THE CLOUD" by Jason Scott: http://ascii.textfiles.com/archives/1717

Proud user of Linux and SELinux (mandatory access control system, which protects your data from your apps).

mwexler 3 days ago 0 replies      
As the post mentions, in most cases of mainstream media, we all agree to licensing our purchased media, not owning it, and so publishers and resellers assume that all media in our possession must be purchased, and so are under their control.

Of course this ignores user-created media, freebies, and gifts... but even for purchased access, I always wonder about that forced (dare I say clickwrap?) licensing. Why should I pay again for a different format of the same info (Amazon format vs. epub)? Why should the NYT force me to pay extra for tablet AND phone access to the same content? Why should I pay extra for a digital copy of a movie that I already have on Blu-ray? And of course, why should I be forced to allow the publisher to control media on my drive?

We see this with books: printed books say you have a license to read them but not to redistribute them electronically in any way without permission, even snippets or out-of-print books (the long-running Google Books issues), and Amazon has revised Kindle books on users' devices. And as the post mentions, we see it with music and movies/shows as well.

It's not as easy to distribute content these days as we may think; many book publishers tried direct-to-consumer ebook sales over the last few years and are pulling out of that game (the fact that O'Reilly and Tor have worked so far is because they offer flexibility and target techies). Libraries are shifting to emedia services like BiblioBoard and Hoopla which don't even let you download the content except in very restricted apps; instead, you read images online in a browser or stream only. The goal again is to enforce the licensing and move consumers away from the "own" model.

At the end of the day, content creators should have a say in how their content is consumed and sold, publishers have demanded a say, and resellers want to own the customer and their data. The current model is too adversarial; I hope we can come up with a way to reward content creators while still allowing a reasonable flexibility of consumption and appreciation.

PS: We seem ok when Netflix drops a movie because we understand we are renting access to their basket. I guess publishers want that too, only they want a higher per unit price with even more control. Sigh.

mjw_byrne 3 days ago 2 replies      
I've been telling people for a long time to avoid Apple because it's an organisation which has demonstrated a pattern of contempt - to its customers, to its competitors, to the developers who use its app store and to the courts. The replies I get are so depressingly apathetic: "but the iPhone is nice"; "I like how simple it is to use"; "the design is so pretty". How can people be so easily charmed into bending over for out-of-control megalomaniacs? Maybe the secret to Hitler's popularity was the uniforms, which were admittedly really snazzy (Godwin's law, I know, I know...)
pkorzeniewski 3 days ago 1 reply      
This is ridiculous, but that's what you get by trading ownership and privacy for convenience. I keep all my music in .mp3, movies in .avi and books in .pdf, which means I can access them on any computer, using any software, and I don't need to worry that one day I may lose access to them. The world is going mad. I wonder when (or if) people will realise how much control over their stuff is in third-party hands, and this is just the beginning: we're now entering the era of IoT, autonomous cars and so on. Fuck all of that. I'll use "dumb" stuff as long as possible, even if it's sometimes less convenient; at least I don't feel like a sheep following trends while getting more and more dependent on corporations whose only interest is to squeeze you as much as possible.
kalleboo 3 days ago 1 reply      
What would cause iTunes to actually delete the music? AFAIK, it doesn't have an "Optimize Storage" option like Photos does, and it's never deleted any files for me even when my disk is down to 0 bytes free.
andremendes 3 days ago 0 replies      
Wow. It's hard to believe they were capable of fooling us this long. I mean, how can we rationalize it after they say they will keep our files, and if we end business with them, goodbye? This should be unacceptable. Things like this make me sympathize a bit more with Stallman's hardcore philosophy about software.
boxfire 2 days ago 0 replies      
All I can think of is my wife's photography collection. She has terabytes of photos. Imagine the day "iPhoto" works like this (which is probably not too far off).

We have her do a manual backup periodically, but now I am going to automate that and make it robust. I am honestly afraid for the future of that data. This is just another part of why I am glad all of my own important data is on Linux machines that I have relatively strong control over, and backup automatically.

What a time to be alive! I can't wait until they can edit your memories out of your brain and store them on the cloud, you know for more capacity and safer recall.

firegrind 3 days ago 1 reply      
I'm surprised that it's not the re-encoding of the victim's compositions that grabs the headline.

Could I copyright a piece of unpublished music, then use the converted copy to bring a case against Apple for copyright infringement?

abalone 2 days ago 1 reply      
So, let's be clear about one thing: This is officially NOT how the software is intended to function. According to Apple, original files are never deleted: https://support.apple.com/en-us/HT204962

I realize the blog post says an Apple rep told them otherwise, but if true, they were wrong.

Also, who says "the software is functioning as intended"? That's not the style of casual speech that Apple support reps are trained to use with real customers.

dendory 3 days ago 0 replies      
A long time ago I started to use the iPhone music app. Then I quickly discovered that songs sent to the iPhone could not be sent back the other way. This was just the beginning of Apple's lock-in. I got out of it right then and there. Now I store my music on my own server and use one of the many third party music apps to play them.

If a service smells of lock-in, you can be sure things won't become better, they will become worse like the issues people get with Apple Music and iCloud. You care about your files? Then you manage them, don't trust some service you have no control over.

logan5 3 days ago 0 replies      
I had a non-English song on my Mac and iPhone prior to Apple Music subscription. Now, my iPhone has a very different non-English song altogether.

What seems to have happened is, Apple incorrectly matched the original song to another non-English song. Later, it deleted the original song on my iPhone and gave me the incorrectly-matched version.

V-2 3 days ago 1 reply      
Looks like RMS was onto something after all ;)
konart 2 days ago 2 replies      
I never understood why people use iTunes. If I ever had to name the worst Mac app, this would be it.

Store your music locally, listen via a proper player (http://swinsian.com, for example), back it up, and be happy.

alwaysdownvoted 3 days ago 1 reply      
It is a known fact that Apple tries the same ideas more than once, sometimes years apart. If at first they fail, they will try again.

I recall an idea from the past where they wanted users to disclose all of their non-iTunes music to Apple in return for some perceived benefit. At the time I thought of this as a way for someone at some company to assess how much CD-ripped, Napster-shared, or otherwise independently-sourced music was still out there. Needless to say, it didn't fly.

I have never in my life used itunes. I can tolerate most of today's "walled-gardens" but not one that seeks to place a surcharge on friends sharing music, which has always been the essence of how my music collection was built (pre-digital). I would give up music before I would sign on to letting Apple control my music collection.

If there were a robust, tiny command-line version of "itunes" that would run on any computer, I might reconsider. But that's not happening either. That's the true reason I have never used itunes. Strong distaste for the proprietary Apple-only graphical software.

Olap84 3 days ago 2 replies      
Reasons to pirate number 567839234
jonconley 3 days ago 0 replies      
I've had this same thing happen.

Not only do I have countless files gone; if you let the subscription lapse, you don't get your music back. It wasn't there when I renewed.

Also, the cellular providers have to love how much of my own music I'm downloading over their costly data services.

gcatalfamo 3 days ago 0 replies      

"through the Apple Music subscription [...] Apple now deletes files from its users [...] iTunes evaluated my massive collection of Mp3s and WAV files, scanned Apple's database for what it considered matches, then removed the original files from my internal hard drive. REMOVED them. Deleted. If Apple Music saw a file it didn't recognize (which came up often, since I'm a freelance composer and have many music files that I created myself) it would then download it to Apple's database, delete it from my hard drive, and serve it back to me when I wanted to listen, just like it would with my other music files it had deleted."

sleepless 3 days ago 0 replies      
The power of backups! Also the power of not using cloud services...
dwighttk 2 days ago 0 replies      
This is awful and I'm glad I stayed away from the 3 month trial because I would hate for this to happen to me.

Also: I know of a few people who would love for this to happen because it would open up a lot of space in their hard drive which is almost completely full. "I subscribed to Apple Music and freed up 100 GB! Sweet!"

That being said, there should definitely be a way to avoid this, and I would personally prefer it to be opt-in rather than opt-out.

nerdponx 3 days ago 0 replies      
> iCloud Music Library is turned on automatically when you set up your Apple Music Subscription. When your Apple Music Subscription term ends, you will lose access to any songs stored in your iCloud Music Library.

That wording does not suggest to me "all of the music in iTunes will become part of your iCloud music library." You can't sue for damages, but you can sue them for lying or concealing information in the EULA.

emodendroket 3 days ago 0 replies      
When I used iTunes Match they wouldn't delete your files unless you specifically asked for it... which seems to me like a more sensible default behavior, but I guess Apple is getting pretty aggressive about trying to change user expectations.

e: Although one very rich thing: when I wanted to cancel, it was impossible to do so without installing iTunes on my new computer, until I contacted support to complain.

snowwrestler 3 days ago 0 replies      
I like a lot of what Apple does. I love their computers and phones. I've posted here numerous times about that.

But I have not turned on Apple Music, or iCloud Photos, for this very reason. Cloud syncing services are a DISASTER for Apple.

I recently bought my wife an iPhone SE, and upgraded some older computers to El Cap--fresh install + migrate files so I could get a clean start. But turning on the cloud services, and configuring them to do what I expect, was so complicated and scary that I just gave up. I'm syncing Safari bookmarks and Notes, and that's it.

And even the Notes are a mess. I'm stuck with separate local and iCloud folders of notes, because I simply don't trust Apple to not screw up or delete my local notes in the course of trying to sync them.

Rule #1, #2, #3 of cloud syncing should be "do no harm." DropBox has largely figured this out. Apple has not.

It's funny to think back to when Steve Jobs said that DropBox is a feature, not a business. Well, it's a feature I use and happily pay for, and it's a feature that Apple has yet to figure out.

gumby 2 days ago 0 replies      
I'm a long term Mac user (primary desktop/notebook since mid 90s), but I do feel Apple is cavalier about the Hippocratic oath of data ("First: do not lose or transform the user's source data").

Since my mac is basically a unix machine it's trivial for me to make sure this rule is followed, but it's not trivial for most people.

tabulatouch 3 days ago 1 reply      
Nothing beats offline data.
k-mcgrady 3 days ago 0 replies      
I don't see how they can say this is a feature - because it didn't happen to me. I have a huge amount of music that they wouldn't be able to match and they didn't delete any of it. Maybe this is because I'm also using iTunes Match. Your issue might be iCloud Music Library (not Apple Music).
makecheck 2 days ago 0 replies      
I don't understand why deletion would even be necessary in a case like this. Are people filling up their drives that quickly? If a service thinks that some data has become redundant, just move the original out of the way and tell the user where it went and what's in that stash.
darreld 2 days ago 0 replies      
I went all-in on Mac in 2001. Pretty unix and awesome hardware. I used iTunes as soon as it was available. As a matter of fact I'm actually looking at a 1st generation iPod on my desk right now (preparing it for eBay).

I have steered clear of this service from the beginning, partly because I'm old and want my own old library, and partly because I am very aware that Apple can NOT do services of any kind, at all. They have botched everything they have ever tried to do online: .me, .mac, accounts (I have 3 and they cannot merge them).

I feel for this guy but, given their history, it's pretty much a slam-dunk that Apple would drop the ball if it's a service.

oneeyedpigeon 3 days ago 2 replies      
This is exactly why I've held onto all my CDs, and why I continue to buy the majority of my music in a physical format. I once had Apple ... iTunes match? iCloud? I can't even remember what it was called ... decide that an Eminem album I had was too racy and serve me a censored, unlistenable version instead. And the complexity involved in keeping any kind of sizeable music library in check using iTunes is overwhelming. I have two apple ids, some music on an external hard drive, some on an internal hard drive, some subset/superset/overlapping set of which is 'somewhere' in the cloud; it's a nightmare. Is there a sensible alternative to iTunes that will a) not do anything with 'the cloud' b) sync to my iPod?
jmh42 2 days ago 0 replies      
While this is entirely unethical, it is an amazing business decision if they get away with it without any harm to their reputation. I wonder if the reputation risks were considered when they rolled out this "feature". It's a good example for Bruce Schneier to add to the next edition of "Liars & Outliers: Enabling the Trust that Society Needs to Thrive".
cha5m 3 days ago 0 replies      
Stuff like this often makes me question using any DRM or platform-locked services, particularly Steam. If Valve decided to make a shady decision, there is very little that its users could do about it.
bitJericho 3 days ago 1 reply      
I'd sue em regardless of the EULA. But I don't use apple anything, and I run my own "cloud" and run my own backups. So nothing for me to worry about.
demarq 3 days ago 0 replies      
Might this also be a problem with dealing with a big provider?

If this were a much smaller company you might actually get to talk to someone who can do something, i.e. "let me see what files we uploaded from you... oh we'll send you a link...". With larger companies you usually talk to "explainers", or people whose job is to explain to you that you are screwed rather than explore solutions to your problem.

joesmo 2 days ago 0 replies      
Once again we have criminal action covered by the CFAA completely ignored by the government because a large corporation is doing it. Serving malware knowingly? No problem. Overstepping your access and deleting files that aren't yours? No problem. But God fucking forbid an individual even types a wrong URL and they're in jail. Double standard much?
imgabe 3 days ago 0 replies      
I've had exactly one iDevice, an iPhone 3G, and the experience with iTunes turned me off to them forever.

What should be a relatively simple task, plugging in a drive-like device to my computer and moving files from one place to another, is made needlessly complicated and restrictive by forcing it to slog through the tar pit that is iTunes. Never again.

jasonthevillain 3 days ago 0 replies      
Oh yeah. Never allow iTunes to manage your music (there's a checkbox). In 2010 it wiped out the metadata from about 3000 songs and decided they were all united under "Unknown". I tried a few autotagging libraries but had to do at least half of it manually.

Sadly I haven't found a decent replacement for it.

joshuaheard 3 days ago 0 replies      
Not an Apple user, but I never sync streaming services to my personal collections for fear of cross-contamination. I always set up a dedicated folder for the streaming service, and if necessary, copy my collection to that. Plus nightly full backups to a separate internal drive and Carbonite.
naryad 2 days ago 0 replies      
Really sad about what happened to the blog author. This reminds me of the South Park episode https://en.wikipedia.org/wiki/HumancentiPad manifesting in reality.
zyxley 2 days ago 0 replies      
There's obviously some kind of bug here, and one that Apple should rightfully be lambasted over, but given that plenty of people, including me, have used Apple Music without anything local getting deleted, it's pretty obviously not intended behavior.
colin_jack 3 days ago 0 replies      
I unsubscribed from apple music when I found out about this and discovered iTunes would not allow me to sync music I bought from bandcamp onto my iDevices whilst using the iCloud support.

It does seem to be a worrying time for Apple; product-wise, they are making some strange decisions.

dibujante 2 days ago 0 replies      
Isn't this a felony under the Computer Fraud and Abuse Act? It's intentionally exceeding authorized access to a computer and intentionally (not even recklessly) causing damage.
feld 3 days ago 0 replies      
iTunes has readonly access to my music stored on my NAS. This is precisely why.
davidhariri 3 days ago 1 reply      
I believe there's a setting when you initialize your library that allows iTunes to manage your library. If this is turned on, when you drag a file into iTunes, Apple copies the file into its own local storage so you have two copies of it. If you delete a file in iTunes it will delete its local copy. Usually digital collectors will have this turned off if they're using iTunes, since they like to organize and name their files elsewhere and they don't want to have two copies of the file. If you manage the library yourself, I'm 99% sure you would avoid any funny business, even with an Apple Music subscription. The interplay of Beats, Apple Music, iTunes Match, audiobooks and iTunes was definitely a blunder. In my opinion they should all be separate apps.
donatj 3 days ago 0 replies      
Is this a setting or something? I use iTunes Match and most certainly still have all my existing local copies. I know because I regularly back them up and they are certainly still there.
loup-vaillant 3 days ago 1 reply      
Proprietary software working against the interests of the user is not really surprising. What does surprise me is how visible and immediate the effects are.

Seriously, just use free software.

gp7 3 days ago 3 replies      
wavs having "more samples" is a new one to me. Neat.
macawfish 2 days ago 0 replies      
Apple was overstepping the boundaries a decade ago. I guess this is turning out to be a multigenerational struggle.
Dorian-Gray 2 days ago 0 replies      
I am a published music producer, and I would suggest pursuing legal action against apple if their policy has resulted in a person's self-created music files being copied and manipulated. This is copyright infringement, and would be damaging to any artist - especially if the served self-created tracks are compressed significantly prior to their storage in apple's servers. I know that my self-created music files represent years of my life, and for those files to be permanently damaged would be a huge blow.
mring33621 2 days ago 0 replies      
All I can say is that I'm sooo glad to find out that I'm not the only one who struggles with and is afraid of iTunes.
jvagner 3 days ago 0 replies      
I was going to come here to say that this was noted behavior when Apple Music Match debuted, but apparently very few people are aware of it.
investinwaffles 3 days ago 0 replies      
This behavior also occurs in the traditional iTunes store if a record label alters the track list of or re-releases an album.
Chromozon 2 days ago 0 replies      
Any lawyers around? I feel like this case would be simple to win in court. Intentional property damage.
thevibesman 2 days ago 1 reply      
A rebuttal to the idea that Apple is "stealing" music was posted [1]: "No, Apple Music is not deleting your music unless you tell it to"

[1]: https://news.ycombinator.com/item?id=11638308

Globz 3 days ago 2 replies      
What is a good alternative to iTunes to manage music (not cloud related) on Windows?
minusSeven 2 days ago 0 replies      
At this point I am really glad I could never afford an apple product.
croon 3 days ago 1 reply      
Cue the Apple rationalists in 3... 2... 1...

That really sucks, but I do hope you have a local backup and/or offsite backup and not mentioning that for effect?

It's still amazingly evil by Apple, but I hope you still have your files.

dandare 3 days ago 0 replies      
Am I the only one who recalled the 1984 ad and cringed?
ommunist 3 days ago 0 replies      
VOX is the answer. And ... geeks back up.
intrasight 3 days ago 1 reply      
This is not directed at anyone in particular, but:

"syncing" with a cloud service - stupid

not backing up - stupid

msimpson 3 days ago 0 replies      
I smell a lawsuit.
MrPatan 2 days ago 0 replies      
It just works ;
PaulHoule 3 days ago 1 reply      
iTunes is malware. It's the #1 reason why the first thing I do when I get a Mac is install Windows on it.
cscawley 3 days ago 0 replies      
Buy a NAS
pietrasagh 2 days ago 0 replies      
use computers not apples
mbrutsch 3 days ago 2 replies      
Lie with dogs, you get fleas. I have zero sympathy for those who fall prey to closed ecosystem BS like this, including my own wife. People just don't listen.
g4z 3 days ago 1 reply      
exactly! every time i read about someone having a problem with something, i always check to see if my friends have had the same problem. if not, then clearly the problem doesn't exist. pri.ck
dang 2 days ago 0 replies      
We've banned this account for egregiously violating the HN guidelines. If you don't want it to be banned, you're welcome to email hn@ycombinator.com.


We detached this comment from https://news.ycombinator.com/item?id=11639435 and marked it off-topic.

Hydraulix989 2 days ago 1 reply      
"Apple stole my stolen music" ftfy
noonespecial 3 days ago 1 reply      
Sounds harsh but all I read was that someone had 122 gig worth of files they considered precious to them and no backup.

Computer systems fail for all kinds of stupid reasons. What will you do when that disk fails?

I admit this is a particularly stupid and infuriating reason for data loss, but how little would it cost to have a backup of 122gb?

tacos 3 days ago 0 replies      
Writing histrionic nonsense like this does not advance your cause. Apple did not "steal" your music any more than "rm -rf" or 300 million poorly manufactured Seagate drives did.

"Matches found, press OK to delete [list of files] from your computer and phone FOREVER" is what 99% of customers want.

Debate the quality of the match algorithm (it sucks), argue the proper UX for a pretty basic function (it also sucks), but knock off the Orwellian sky is falling stuff. Because that's obvious bullshit.

Apple has its own view of the culture, and has a pretty obvious way of mining it for profit. You ain't changing them. You can perhaps however nudge them if you're not a total asshole about it.

jjuel 3 days ago 1 reply      
My real question is: if you truly are a composer making your own music, why are you just storing it on your Mac? Wouldn't you want to have a NAS for that or some other storage option? I would not rely on my Mac for storage of music I thought was very important, including rare copies of songs or songs I created myself. I would have a different, more reliable storage option.

However, that is really shady on Apple's part to make that mandatory, and should be looked into.

buserror 3 days ago 3 replies      
To be fair, they probably came up with that as a way to reduce the /other/ army of people coming to see the 'genius' to restore the laptop they've never backed up EVER and have accidentally wiped, containing 10+ years of their digital life...
nurmara 3 days ago 2 replies      
I sympathize with OP and I generally really hate Apple's direction when it comes to software and user experience ever since OS X Lion.

However, OP shouldn't claim that Apple 'stole' his music files. He signed up for a paid service and he should have checked how the service would affect his computer. We as users are responsible for knowing what software we're running on our machines and what it does to our data. I understand that a standard TOS agreement is ridiculously long and impractical to read, and I guess that's how we are trained to be lazy about protecting the integrity of our systems and our data, but that does not give OP the right to blame Apple for delivering exactly what they promised. I hate Apple but I don't think that it is intellectually honest to claim Apple stole OP's music.

givinguflac 3 days ago 2 replies      
I read the whole thing and all I hear is waaaahhh, I don't keep proper backups of data I care about, nor do I learn about the service functions before signing up... This is ABSOLUTELY NOT the way it works; it's not a shady practice. I use it daily and have lost nothing. Sorry you ran into a bug, but this is just stupid. I would be shocked if the phone agent he spoke to was a real person.
OxyContin's 12-hour problem latimes.com
502 points by sergeant3  1 day ago   330 comments top 42
cant_kant 13 hours ago 7 replies      
Sensible doctors do not believe drug company marketing.

I get large amounts of ad-junk from drug companies that ends up unread in the bin. I refuse to meet with drug company representatives. I smile politely at them if I bump into them in the corridor and suggest that they leave their ad-junk with my secretary. My staff then file their ad-junk in the trash bin.

On Friday, I had a drug company representative attempt to tell me ( he was hanging around my coffee area ) about the joys of Targin, a fixed-dose combination of oxycodone and naloxone. I gently shook him off, and directed him to my secretary.

Drug company representatives are usually decent human beings with lives and families. However they are poorly educated, poorly informed salesmen and women with sales targets to meet and product managers to keep happy. Even worse, they and the drug company have no accountability if a patient dies because of their recommendations. If avoidable death supervenes or if there are non-lethal complications or even just therapeutic failure, I am accountable.

Instead of relying on marketing, I rely on information from good, well performed randomised controlled studies published in reputable peer reviewed journals ( I like the NEJM ) and on meta-analyses of these. I view the results of these through a filter of scepticism, cynicism, pragmatism and a modicum of hope.

Many of my colleagues do likewise. I trust that you do the same in your respective vocations. Regrettably, there is a bell curve. I am sure that the drug companies find enough gullible prescribers out in the wild for their purposes.

siliconc0w 20 hours ago 3 replies      
Armchair Policy Wonk:

* Make FDA approval double blind. Can't bribe who you can't see.

* Before starting a study, if you want to include it with a future application you need to register it. All registered studies need to be included with an application. Can't cherry pick studies.

* You may only use approved claims stated in the application in marketing or product descriptions. (which I thought was already the case)

rickdale 1 day ago 0 replies      
Richard Taite of Cliffside Malibu was on Bill Maher last night and he said that a study came out recently (within the last month), done by I think the FDA, that concluded that OxyContin was only to be prescribed to people for 7-12 days maximum. And he said that everyone was on board, but then the prescription drug lobby totally squashed it.

The United States has 5% of the world's population & consumes 75% of the world's prescription drugs.[0]


KaiserPro 19 hours ago 5 replies      
There is a difference between marketing and actual use. It's a synthetic opioid, which means it has all the variability of opioids. Any decent doctor should know this, and the guides, if they were any good, will indicate it.

For example, co-codamol is now not prescribed in a hospital setting for under-5s, after one child died because they had large amounts of the enzyme required to process it all at once. This caused respiratory arrest.

the BNF has prices: https://www.evidence.nhs.uk/formulary/bnf/current/4-central-...

and some basic advice: https://www.evidence.nhs.uk/formulary/bnf/current/4-central-...

also a big fat warning saying it's a fucking opiate. That a doctor would be surprised that a synthetic slow-release diamorphine derivative has variable outcomes is deeply upsetting. It's their fucking job to monitor the outcome of drugs. It's pretty much half of being a doctor.

One of the many advantages of the NHS is that the chances of getting hooked on painkillers are minuscule, mainly because drugs are prescribed for the purpose of helping you, not because you've seen an advert, or because someone wants to make a bit more money.

mwsherman 22 hours ago 4 replies      
The problems derive from the prohibition, which takes several forms. The forms add up for big profits and bad incentives, for which the consumer pays.

FDA approval is the first layer of prohibition. Very expensive, this keeps a lot of competitors out. Lots of market (read: pricing) power is conferred to the winners. Great incentive is provided for regulatory capture.

The second layer is patent protection. Again, competition becomes illegal, with outcomes similar to above.

Third is scheduling it as a prescription drug. The incentives of the drug company are now to influence doctors, in ways that may be more or less overt. Docs get a cut of the outsize profits, de facto.

The fourth layer is opiates being illegal in general. Competitors out.

Of course there are legitimate reasons for prohibition, even if I don't agree with them. One can make an argument for protecting people from harming themselves.

But these arguments are naive and static. By dynamically scoring the cost of prohibition, following the incentives and the outcomes, we might choose differently.

Fede_V 1 day ago 2 replies      
The best part of the article:

"Dr. Curtis Wright, who led the agency's medical review of the drug, declined to comment for this article. Shortly after OxyContin's approval, he left the FDA and, within two years, was working for Purdue in new product development, according to his sworn testimony in a lawsuit a decade ago."

Says it all doesn't it?

pappyo 1 day ago 6 replies      
The pervasiveness of OC provided by doctors is amazing. Even as of a year ago, I went into the hospital for back pain and left with a script for two fistfuls of oxy. I looked at the doctor and legitimately thought he was trying to get me hooked. And this was at a world-renowned hospital.

I have to wonder if Purdue is involved in some kickbacks to doctors who (over)prescribe.

coleca 15 hours ago 0 replies      
My late father was addicted to Percocet / Percodan for much of his life after being severely wounded in the Korean War and then followed by the prostate cancer that was his ultimate demise. In the 90s, his doctor switched him to Oxycontin because of its ease of dosage (it's hard to remember when you took your last pill when you're high). Neither I nor his doctors knew that this 12 hour claim was just so that Purdue could defend its patent, despite the studies (according to this article) that pointed to it lasting far less than 12 hours.

Reading this article now helps me to better understand the pain and suffering he endured from these drugs, and gives me insight into the lengths he went to for "rescue" medication, which included doctor shopping and paying retail price for more than half of his meds so that his insurance company wouldn't report him to the state or DEA (he did get caught eventually).

It boggles my mind to think how an industry whose purpose is to help people can become so corrupt and I wonder whether there is any hope of this getting better before it gets much worse. With the mainstream media playing enabler, it's doubtful it will ever happen (name the last news broadcast that wasn't wrapped end to end in big pharma advertising).

thebigspacefuck 23 hours ago 3 replies      
Really the duration can vary quite a lot. I was put under on fentanyl for having my wisdom teeth taken out and woke up in the middle of the operation in extreme pain as someone was digging my teeth out with a metal instrument. I moaned at him, through a mouth stuffed with gauze pads, to get the fuck off me and let me out of there if he wasn't giving me more meds. He proceeded to angrily accuse me of taking dope, which I had, quite a few times, but never that much, in high school a few years before. I had already explained this before the procedure and assumed they had adjusted the dose. Explaining this again, with tears of pain rolling down my face, the nurse told me it's okay and, I assume out of some attempt to comfort me, that she'd taken dope too. They gave me the fentanyl and I passed out again.

My opiate use in high school was infrequent and I'd never experienced withdrawals in the slightest, though I'd say I probably took a double or triple dose a few times, mostly because it never seemed to do that much for me. I was probably 16 when I started my experimentation and it ended by the time I was 17. Freshman year of college I was suffering stomach pain and headaches, so the doc at school prescribed Reglan and a small amount of Vicodin. Painkillers for a headache? By the end of the next day I had taken them all, in total 25mg of hydrocodone spread out over a day and a half. I take ibuprofen for headaches anyway. That was probably 8-12 months before my wisdom teeth procedure, but somehow either that one recent time or the times I took opiates in high school years before were enough for the fentanyl to wear off and for me to get yelled at by a doctor for it. I can't figure out if I always had a high tolerance or if opiate tolerance never goes away.

Looking at this chart of equivalent potency, https://en.m.wikipedia.org/wiki/Equianalgesic, it seems like I probably took the equivalent of a 15mg OxyContin at the highest dose I ever took (22.5mg hydrocodone), so it's really a mystery to me why I would have an effect like that. In my opinion, the problem here is that a higher dose was recommended instead of a longer-acting opioid. However, I think people should always research their medication and talk to their pharmacist. I never took the Reglan because there's a risk of Parkinson's. My friend was prescribed a huge dose of antidepressants and ended up in the hospital with serotonin syndrome. "Doctor knows best" is a lie when it comes to medication. Pharmacist knows best. Always ask the pharmacist.
jakobegger 21 hours ago 3 replies      
I can't believe how doctors apparently routinely prescribe addictive painkillers in the US. Here in Europe, as far as I know, doctors only prescribe non-opioid painkillers. I don't know anyone who was prescribed an opioid drug outside of an ICU.

Ibuprofen is surprisingly effective for many types of pain; I don't see why doctors immediately prescribe something as dangerous as OxyContin.

intrasight 23 hours ago 1 reply      
I had three orthopedic surgeries in 2014. Each time I got a vial of OxyContin. Figured I'd use 'em since I'm not much into pain. But the pain was not so bad. Once I tried an OxyContin to see if I could tell the difference. I could not.

A colleague tells me that they have a street price of like $40/pill. So then I'm wondering how many people know that I have a thousand dollars worth of street drugs in my house. I dropped them off at the local drug disposal facility.

morgante 1 hour ago 0 replies      
This entire article is an indictment of the gigantic market failure of medicine. Everyone is incentivized to do the wrong thing.
encoderer 22 hours ago 0 replies      
That 80mg oxy pill is a sad symbol of modern American life.

We idealize pills. We talk about a future where all food could be replaced with a pill. We attribute the starting gun of women's liberation to... THE pill. So many pills have captured our imagination and our culture: viagra, valium, adderall, prozac, mother's little helpers, morning-after pills. To us, medicine often means pills.

We live lives that are too busy with artificial constraints, without enough time connected to relationships and leisure and nature. We live accelerated lives and have no time to be unhealthy. A pill that you take home and take just once every twelve hours -- at the start of your work day and the end -- is the only kind of cure we have time for.

There is a reason Americans consume the vast majority of pills. Far far more than other rich countries.

heisenbit 23 hours ago 1 reply      
A perfect storm:

- Corporate misinformation pushing a significant set of users on a repeated and addiction forming schedule.

- Morphine (one of the proven and cheaper alternatives) perceived as problematic / cancer drug

- Doctors having less hassle and financial incentive to prescribe OC on q12 schedule than alternatives

- shame on user side and war on drugs policies that make it hard to reach out for help

The good news is that the internet makes it harder to spin misinformation over the long run.

auggierose 1 day ago 1 reply      
It seems lots of pain was inflicted on many people because of the patent system. Usually drugs are upheld as the prime example of why patent systems are necessary. This seems more than questionable, given this example. Combined with the fact that companies have no conscience, it would be nice to have better ways to do expensive research than a) in companies and b) for patents.
pakled_engineer 22 hours ago 2 replies      
Another problem is all the so-called anti-abuse safeguards, like timed release so the pills can't be crushed. That means you can't take half the pill and, a few hours later, the other half, which is what some people did to get around the false 12-hour relief advertising before the DEA and other government meddlers got involved.

As for the FDA guy materializing at Purdue, this is par for the course for all gov positions. A cabinet minister here was instrumental in vetoing the Bank Act, which saved banks millions in taxes. Of course that minister materialized on a bank's board of directors, as a thank-you, after being kicked out of office by the voters. When that bank started making large party donations they appointed the same ex-minister as dean of a local university, where, immediately upon his being parachuted into the role, credit card tables were set up on campus by the same bank to rope students into applying for them. Plenty of city employees and councillors who allowed unpopular developments ended up on the boards of developer corps too, only later to be appointed to special environment advisory committees that of course pushed for pro-development regulations. I'm sure this ex-FDA pharma shill will end up somewhere else so he can further manipulate the system to the benefit of his patron too.

potatosareok 1 day ago 0 replies      
It's pretty depressing how casually people's lives are thrown away for money. I can't imagine how angry I'd be if I was dealing with a family member who was prescribed these increasingly higher doses of OxyContin and going to a doctor who refused to prescribe anything else because Purdue reps insisted it was safe and effective at 12-hour doses.

Guess the joke's on us, though: the Sackler family gets knighted and modern institutions like MoMA let them slap their name on buildings for $$$, so who's really any better.

Dowwie 1 day ago 0 replies      
Given this evidence, will physicians who increased dosages rather than frequencies lose their licenses -- as they ought to? Physicians decide what is best for their patients. Choosing what is best for a company rather than the patient is a serious breach of ethics.
MichaelGG 16 hours ago 0 replies      
So the guy suing claims he was on 400mg a day. OC doesn't come in 200mg, right? So he must have been prescribed what, 80mg * 5? Or even more pills to take twice daily, so he could have spread the doses. At that point, what's the complaint with or point of OC?

Anyways, this just highlights the issue with the DEA. People should be able to fairly freely get pain medications. It's their choice and they should not have their freedom curtailed by professional gatekeepers and bureaucrats.

JimboOmega 21 hours ago 0 replies      
And heroin was marketed as a "non-addictive morphine substitute"... we know how well that turned out.

Of course that started in the 19th century so we're not so easily fooled any more, right?

cauterized 12 hours ago 0 replies      
The article seems to suggest (but doesn't outright state) that undergoing withdrawal accelerates the development of addiction. Is that a known effect with opiates?
Synaesthesia 1 day ago 0 replies      
It's pretty crazy that the corporations can market to doctors with outright lies like this. Complete arrogance and ignorance of the complexities of drug interactions.
pstuart 23 hours ago 1 reply      
It will be interesting to see how well CBD (Cannabidiol) based medications work once the doors open to prescribed usage on a national scale.
allisthemoist 23 hours ago 0 replies      
> The U.S. Justice Dept. launched a criminal investigation, and in 2007 the company and three top executives pleaded guilty to fraud for downplaying OxyContin's risk of addiction. Purdue and the executives were ordered to pay $635 million.

A mere fine for ruining the lives of hundreds of thousands, if not millions of people.

sgnelson 16 hours ago 1 reply      
Does anyone know what caused the sales (see the chart "OxyContin Sales 1996-2014") to take a major dip in 2005 and 2006 and then rebound like crazy in 2008?
c3534l 17 hours ago 0 replies      
In other words, OxyContin is basically like any other opiate painkiller, and its problems come primarily from people misusing the drug, which is why it's prescription-only in the first place.
nkrisc 23 hours ago 0 replies      
Even though companies like Purdue deal in the realm of health and wellbeing, they have no interest in either, only in profit. Profit that may come at the expense of health and wellbeing, while appearing otherwise.
OliverJones 18 hours ago 1 reply      
I wonder if the accelerometer in a smartwatch could measure levels of pain in a patient? I wonder if tense muscles and other aspects of suffering could be measured and tracked.

I suppose a smartphone app could ask the patient to tap one of the ten pain-level smiley - frowny - sweaty faces at intervals as well.

If so, there'd be a way to measure how fast these painkillers clear from each patient. It might even make it possible for some patients safely to play an active role in their own dosing. Avoiding wide swings in blood concentration is the safest way to go.

fapjacks 3 hours ago 0 replies      
Just a suggestion for anyone stuck on opiate painkillers that wants out: Kratom will help you come off of it.
stuaxo 22 hours ago 1 reply      
Great article; it could do with editing down by about 50%.
guelo 22 hours ago 0 replies      
I'm more alarmed by the incompetence of the doctors. If someone like Microsoft started heavy pressure marketing a developer tool that did not work as advertised programmers wouldn't need the help of the FDA to tell Microsoft to fuck off. But doctors don't seem to understand or care what they're doing and just follow whatever the latest expensive marketing campaign tells them to do. It's scary.
yarou 21 hours ago 0 replies      
Oxycodone is an interesting pharmaceutical in that it is synthesized from thebaine instead of morphine or codeine.

The problem with opiates is that by the time you get to a dosage that's therapeutically effective, you'll end up building massive amounts of tolerance to the drug. I'm sure pharma companies are well aware of this fact. Strange how we as a society use opiates for pain management rather than cannabinoids like CBD or CBN.

a3n 16 hours ago 0 replies      
OxyContin: respectable heroin.
davesque 21 hours ago 0 replies      
For profit medicine at its best.
guard-of-terra 1 day ago 1 reply      
Why don't patients for whom OxyContin does not work for 12 hours ask their doctor for another drug? One with a smaller interval, generic, and much cheaper?

Would a doctor refuse to prescribe a different, much cheaper drug, keeping them on an expensive drug that does not work? Should they not then lose their licenses?

The whole prescription medicine system always seemed evil to me.

DanielBMarkham 1 day ago 1 reply      
If this is true, this is one of the biggest stories of the year so far. There will be a huge class action lawsuit. Wow. It boggles the mind the number of people harmed here -- if this is true.

I wonder, for those people who do not get the full 12-hour effect, are they eliminating the drug quicker or metabolizing it into its bioactive agent quicker? It could be that not only are they going into withdrawal, they're actually experiencing more of an effect than they are supposed to. (I do not know. It'd be interesting for somebody who knew something to chime in here.)

nefitty 22 hours ago 2 replies      
Just to be clear, Maher has an extreme distrust of modern medicine, to the point that it led me to almost totally abandon his show. For example, he gave the fraud Sam Chachoua an inordinate amount of airtime for someone claiming he can cure HIV with goat's milk. As far as I know Maher is still an antivaxxer, as well. http://rationalwiki.org/wiki/Bill_Maher#Medicine

Anyway, I've learned to take Maher's views and endorsements regarding the medical field with a grain of salt.

gregd 1 day ago 1 reply      
They've got the money to pursue and settle potential court cases throughout the country. It doesn't surprise me that they have the money to hire people to troll the internet and shill for the company.
dang 14 hours ago 7 replies      
We detached this subthread from https://news.ycombinator.com/item?id=11649541 and marked it off-topic.
puppetmaster3 1 day ago 8 replies      
It's quite simple: Don't do drugs.

Next question please.

modanq 19 hours ago 0 replies      
ER Resident Physician here. We are many years in from The Joint Commission's administrators claiming that pain is the "5th vital sign" and now feeling the effects of this disastrous policy. My ED is filled with drug seekers; studies have shown up to 20% of ED visits may be Drug-Seeking Behavior (DSB). It really demoralizes physicians who want to help not only these people but also those in the overflowing waiting rooms.
wapapaloobop 1 day ago 1 reply      
I haven't taken OxyContin but withdrawal symptoms and the concept of hell are interesting (well, afterwards at least).

It's like the mind has a thought, then checks to see how the body reacts (i.e. with joy or fear) and this in turn affects how the thought is interpreted, which then influences what is the next thought, etc.

The oxycontin or whatever temporarily cuts off this mind-body connection (without paralysis!) but eventually the fear comes back in a downward spiral. Even the appearance of everyday objects can then seem threatening and disturbing.

The void left by the parallel port medium.com
490 points by OakNinja  2 days ago   142 comments top 27
billforsternz 2 days ago 3 replies      
The parallel port was great for simple bit-bash interfacing and I miss being able to do such hacking on contemporary PCs. I remember fondly a hacking war story from the late 80s. I had just purchased a Mephisto 68000 dedicated chess computer for my own recreational enjoyment. The embedded dev lab where I worked won a new contract and the hardware dudes decided a 68000 was the way to go. Not content to wait weeks, maybe months for first hardware, I decided to get some early experience by hacking at my chess computer. Basically I opened it up, reverse engineered the keyboard scanning hardware (which was implemented with standard jellybean CMOS logic), and commandeered a couple of inputs and outputs for my own purposes. I hooked these up to the printer port of my PC.

Then I worked out a simple serial bit-twiddling scheme to exchange bytes a bit at a time and coded it up in C as a kind of software serial port. On the PC side I ended up with something that looked just like a simple terminal that happened to talk out the printer port. On the Mephisto side, I replaced the 64K byte EPROM with a 128K byte EPROM. I changed the cold boot vector to point at "my" 64K, which first checked to see if the printer port was hooked up. If it was hooked up, then the CPU stayed in "my" area and ran a monitor I had coded up for the occasion. If it wasn't hooked up, it vectored back to the original boot code and the chess computer worked as well as it ever did. I layered a loader on top of everything and enjoyed loading simple cross-compiled C programs on my new 68K computer (complete with 16K of RAM).
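The bit-twiddling scheme isn't spelled out, but the general idea can be sketched as below (Python here rather than the original C, with the "wire" simulated by a direct function call; the MSB-first ordering and one-strobe-per-bit protocol are assumptions, not details from the story):

```python
class Receiver:
    """Assembles bits strobed in by the sender, MSB first."""
    def __init__(self):
        self.shift = 0
        self.count = 0
        self.bytes = []

    def strobe(self, bit):
        # One call per bit; in hardware this would be triggered by
        # an edge on a parallel-port status line.
        self.shift = (self.shift << 1) | (bit & 1)
        self.count += 1
        if self.count == 8:
            self.bytes.append(self.shift)
            self.shift = 0
            self.count = 0

def send_byte(rx, value):
    """Bit-bang one byte to the receiver, one strobe per bit."""
    for i in range(7, -1, -1):
        rx.strobe((value >> i) & 1)

rx = Receiver()
for ch in b"OK":
    send_byte(rx, ch)
print(bytes(rx.bytes))  # b'OK'
```

In the real setup each `strobe` would be a write to the printer port's data lines plus a handshake with the peer, which is why the scheme was slow but needed nothing more than two spare pins per direction.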

kazinator 2 days ago 5 replies      
The tinkering void left by the parallel port is pretty much filled by microcontroller-based systems-on-a-chip that are smaller than the plastic casing of a parallel connector.

Progress giveth, progress taketh away.

neckro23 2 days ago 5 replies      
I found out about the CNC thing recently while trying to get a hobbyist CNC machine set up. My immediate reaction was "Parallel ports? What is this, 1985?" Turns out, there isn't really any getting around it. It's parallel port or GTFO.

This is how I also learned that dirt-cheap PC motherboards still have parallel ports built in. The higher-end ones generally don't.

zw123456 2 days ago 2 replies      
And don't forget about the venerable FTDI FT232R chip, which has the very useful "Bit Bang" mode: http://www.ftdichip.com/Support/Documents/AppNotes/AN_232R-0... I use those FT232R puppies a lot for all kinds of things that the old PP was good for: programming FPGAs, testing out a new chip or circuit before going to an MCU, etc. It is very simple and inexpensive, and much faster than the PP was.
ryan-c 2 days ago 1 reply      
This reminds me of junior high and high school, making PC peripherals that used the parallel port based on schematics I found on the internet. It was a lot of fun.

I remember building a link cable for my TI-89, NES and SNES controller adapters, a programming cable for a radio shack universal remote, AVR programmers, and other things.

zorpner 2 days ago 4 replies      
What we really need to bring back is the BeBox Geekport: http://www.hardwarebook.info/GeekPort
kirrent 2 days ago 1 reply      
I think this article explains pretty well why we don't need a parallel port anymore. Rather than outfitting every computer with a complicated and slow port, we can have faster, smaller, and more flexible solutions using extra processors in our devices. While I appreciate the ability to control so many IO lines directly, I just don't really see the use case anymore, and this comes from someone who went down the whole "design a new CNC controller" path because the old one used a parallel port. No one designs new hardware these days and bemoans the lack of a parallel port that would make their job easier. The void is only legacy hardware that is difficult to use these days.
Aqwis 2 days ago 3 replies      
So why do those "cheap USB to parallel port converters" only work with printers?
justinlardinois 6 hours ago 0 replies      
Definitely an interesting story, and I'm loving the other ones that come up in this comment thread.

But I have to say the article's title is a little overbearing. USB serves the same purpose parallel ports originally did, plus more. Yes, devices designed for the parallel port might have trouble playing with modern hardware, but you're just as likely, if not more so, to have drivers that don't run on modern operating systems and are closed source, so you can't do anything about it.

sdk77 2 days ago 0 replies      
I fondly remember a university project where we made an mp3 player: hardware based around an Intermetall (MAS) chip and the software to drive it. This was the late 90s, and a run-of-the-mill 486 struggled to play 256kbps mp3s. I designed an interface around a PAL chip that received nibbles from the parallel port and serially sent them to the decoder chip. The program to parse the mp3 and bit-bang the parallel port I wrote in Turbo Pascal, and it actually did an amazing job: mp3s of up to 320kbps could be played. Fun fact: the PAL originally couldn't quite keep up, so I had to increase the supply voltage (as well as the clock voltage level) and attach a heatsink to it!
cortesoft 2 days ago 3 replies      
What about just installing a parallel port on your computer? http://www.amazon.com/StarTech-com-Express-Profile-Parallel-...
GnarfGnarf 2 days ago 0 replies      
The parallel port was really neat. I once connected my furnace to a PC, to track when the thermostats were opening the hot-water valves.

My house has hot-water heating (as opposed to electric or hot air). I would occasionally observe the furnace coming on for a few seconds, then shutting down. I suspected a malfunction. So I connected the 24 v. valves that control the flow of hot water to the radiators, to relays. Each relay opened/closed a pin on the parallel port. I wrote a program to record the status of the parallel port.

I was able to determine that the furnace was coming on at the very end of a heat cycle, just when the room had reached temperature. The furnace would kick in just before the room stopped asking for heat. So it was OK after all.
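The recording program isn't shown, but its core idea, polling the port and logging only the samples where a relay pin changed, can be sketched like this (Python for brevity; the simulated readings stand in for reads of the parallel port's status register):

```python
def log_transitions(samples):
    """Given (timestamp, status_byte) samples, keep only the ones
    where any pin changed since the previous sample."""
    events, last = [], None
    for t, status in samples:
        if status != last:
            events.append((t, status))
            last = status
    return events

# Simulated polling run: the valve relay (bit 0) closes at t=2
# and opens again at t=5.
samples = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 1), (5, 0)]
print(log_transitions(samples))  # [(0, 0), (2, 1), (5, 0)]
```

Dumping only transitions rather than every poll is what makes an overnight log of a furnace readable afterwards.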

digi_owl 2 days ago 1 reply      
One of the most intriguing uses of a parallel port I have seen was back in the early days of digital sat decoding.

It was a crude PCB, some wires, and an old IBM PC.

The PCB was shaped so that it fit the receiver's card reader. Along it ran 4 traces going from the contact pads to the other end, which protruded from the reader; from there, 4 wires were poked into specific holes in the parallel port, and then a program was fired up on the PC that did the decoding.

jpm_sd 2 days ago 3 replies      
This is a cool hack, but I think the article would benefit from a deeper explanation of why "USB to serial" worked and "USB to parallel" would not.
byuu 2 days ago 3 replies      
Props to innovating a solution. But since this is SNES related ...

I've been working with people over the years on an SNES<>PC interface. Idea being, you could dump SNES games from the cartridge port to the PC. Or you could upload hardware diagnostic/test programs (for emulator development) from the PC to the SNES, and then read back the results to the PC for analysis. Or just do crazy stuff like linking multiple SNES units together, support netplay on real hardware, download an unlimited amount of game level content, mix in audio from the PC via SNES commands, etc.

Here's the history: http://i.imgur.com/ucxVbJa.jpg

We started with the SNES controller port and used bit-banging code to a MAX232N, and connected the other serial end to a PC serial<>USB adapter. This worked out to around ~4KiB/s that we could transfer.

After that, I moved to a TTL-232R-5V cable, and spliced it directly to an SNES controller. Same speed, but a lot less bulky.

The speed killer was having to bit-bang everything. The SNES has DRAM refresh cycles that stall out the CPU for a short duration on every scanline, so that greatly impedes the maximum baud rate you can bit-bang at.

Then I found out about the Teensy, which has a synchronous UART mode. Just connect another line that you strobe when new data is available on your pin. They have software drivers that simulate USB serial UART, so you just compile that in and flash it onto the chip. This got us up to around ~30KiB/s, but that was still rather slow.

Finally, we moved on to targeting the expansion port on the bottom of the SNES. It exposes the entire 8-bit data bus. And so we connected that to an FT232H board, which has built-in 1024-byte smoothing buffers in each direction. From the SNES' perspective, it's just like a parallel port. From the PC's side, it's just like a virtual COM port. This lets us hit ~160KiB/s in both directions, and that's with code to verify data is available/can be sent from both ends of the connection on every single byte transferred either way.

The device itself is actually USB high-speed, so we could max out the SNES' 2.68MiB/s transfer rate, but we would lose the ability to ensure data or buffer space was available during DMA transfers, so it wasn't as reliable to do that.
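The quoted rates can be sanity-checked with a quick calculation (the 3 MiB cartridge size is an assumption for illustration, not a figure from the comment):

```python
def transfer_seconds(size_bytes, rate_bytes_per_s):
    return size_bytes / rate_bytes_per_s

MIB = 1024 * 1024
rom = 3 * MIB  # a typical larger SNES cartridge (assumed size)

# At the ~160 KiB/s the FT232H link reached:
print(round(transfer_seconds(rom, 160 * 1024), 1))    # 19.2

# At the 2.68 MiB/s SNES-side ceiling mentioned above:
print(round(transfer_seconds(rom, 2.68 * MIB), 1))    # 1.1
```

The ~19 seconds at 160 KiB/s lines up with the "~20 seconds a game" dump time mentioned below.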


Moving on to the Super Wild Card ... that thing is just a relic. It's unable to dump games that use the SA-1 coprocessor (Mario RPG, Kirby Super Star, etc), or memory map controllers (Star Ocean, Tengai Makyou Zero, etc), making it quite limited.

The device we've built can dump games over a USB interface. Takes about ~20 seconds a game, and can dump absolutely every game in the library. And the dumper is written in pure C/C++. About 200 lines of code to support every possible cartridge type.

When it comes to playing games, the SWC can't play anything with any coprocessors (all the aforementioned games; plus DSP games like Super Mario Kart, Pilotwings, Top Gear 3000, etc) [well, some of these copiers supported DSP-1 pass-through, not sure if the original SWC supported that or not]. On this front, the sd2snes is a vastly superior solution. You can put the entire SNES game collection onto a micro SD card, and run anything the SWC can, plus the DSP games as well.

All the same, even if it's not the optimal solution, I still give props to the author of this hack. It's definitely cool to restore these old copiers, if just for the novelty factor. I'm reminded of a guy who built his own SWC DX2 CD-ROM drive interface for loading games, because the CD drives are much less common.

rocky1138 2 days ago 0 replies      
I'm having the exact same problem with my Atari Jaguar, modified to use BJL (Behind Jaggy Lines), which is a replacement ROM chip that lets you upload games directly to the Jaguar's RAM using a Parallel cable plugged into the second controller port. New PCs don't have a Parallel port and the USB adaptors don't work :(

I've been considering trying to build something using my Raspberry Pi or get into Arduinos to try and fix it, but I'm not really capable with hardware stuff.


fit2rule 2 days ago 1 reply      
I'm really bummed at how quickly perfectly working and valid technologies get discarded when the new hot stuff comes out .. take, for example, SCSI. SCSI was ubiquitous in computing; it's everywhere. I have a 19" case full of perfectly working drives.

But can I, for the life of me, find a modern SCSI adapter with ease these days, say USB-SCSI? Nope. It's just been discarded like yesterday's cheese.

Same goes for the parallel port. What used to be a perfectly cromulent means of wiring up some I/O for quick hacking has now been supplanted by a drawer full of Bus Pirates ..


IvyMike 2 days ago 0 replies      
One of the coolest class projects I worked on in school back in the mid 90's was a compiler for TTL_PAPERS, which was a hardware-assisted low-latency interconnect between PCs used for barrier synchronization. It was also correspondingly low bandwidth, but for the cases where synchronization was your bottleneck, it was hard to beat it.

More info from the inventor (Hank Dietz) here: http://aggregate.org/TechPub/ICPP95/icpp95.html
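TTL_PAPERS implemented the barrier in hardware across machines' parallel ports; in software the same primitive is a stock library feature. A sketch with Python's `threading.Barrier` (illustrative only, with nothing like the hardware version's latency):

```python
import threading

arrived = []
barrier = threading.Barrier(4)
lock = threading.Lock()

def worker(i):
    # Every thread blocks here until all 4 have arrived; that is
    # the barrier semantics TTL_PAPERS provided in hardware.
    barrier.wait()
    with lock:
        arrived.append(i)

threads = [threading.Thread(target=worker, args=(n,)) for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(arrived))  # [0, 1, 2, 3]
```

The appeal of doing this over parallel-port wires was latency: a software barrier over a network costs many microseconds per synchronization, which is fatal when synchronization is the bottleneck.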

protomyth 2 days ago 1 reply      
I miss the 9-pin joystick ports more than the parallel ports. I hooked a photocell to the paddle pins on one and had a huge amount of fun. I remember Be had a huge port on the BeBox for hacking.
jheriko 2 days ago 0 replies      
i love stories like this.

kudos to the author for taking it all the way to the finish line.

philjohn 2 days ago 1 reply      
I have a rather faded memory of using a parallel port to transfer files between our old 386 computer and the shiny new Pentium that replaced it ... much faster than using floppy disks, and this was before everything came with an ethernet port!
xirdstl 2 days ago 0 replies      
Ah, the pre-USB interfaces. The most fun I had was figuring out how to read the input from an old shipping scale which had serial port output. That powered my keg monitoring server, which I eventually wrote mobile apps for. It was the best side project I've worked on!
raimue 2 days ago 0 replies      
Now add a SD card slot and you could load all games you ever owned from it without a computer.
paddi91 2 days ago 0 replies      
Whoever is interested in buying one: https://aisler.net/tltx/swc_usb/swc_usb
beeswax 2 days ago 0 replies      
Man, we were proud when we soldered our own Covox DACs for the LPT; our MODs sure sounded awesome on those :)

And I didn't know DOSBox has an emulation for those, nice!

gaze 1 day ago 0 replies      
I feel like PCI and ISA have left holes as well. PCI/e is a fiendishly difficult protocol.
garyrichardson 2 days ago 0 replies      
I thought this was going to be about the space where a parallel port used to be on the back of your computer. Was I disappointed.
Inside Palantir buzzfeed.com
548 points by JimDash2145  1 day ago   292 comments top 40
callmeed 1 day ago 18 replies      
My understanding is that Palantir is running an engineering mill/sweatshop using fresh-out-of-school CS grads (particularly from Stanford). That much of their time is spent simply writing client-specific code that imports and cleans up data from disparate enterprise systems. That it's not challenging, innovative, or exciting.

Of course this is all anecdotal, based on what I've heard from friends and a Palantir engineer I met at a bar.

Anyone with more insight know if this is accurate?

willalden 1 day ago 2 replies      
Hi all -- I wrote the Palantir article. It's really awesome to see all this discussion about it. As I say in the post, please don't hesitate to contact me if you'd like to chat confidentially. I am always eager to hear any tips or new information. Find me on WhatsApp, Telegram, Signal, or encrypted email. Contact info here:


vonnik 1 day ago 3 replies      
This article is much more interesting than its fairly predictable headline implies. While the secrecy of Palantir has served as clickbait for many years, there's actually real news in this.

And the news is about the difficulties of scaling a services-heavy, on-prem software company that basically rents out forward-deployed engineers at 10s of millions of dollars per year. Especially when the software is an open-source stack, and the engineers are increasingly junior as the company grows. That said, it's still great at sales. Big numbers there...

jbob2000 1 day ago 4 replies      
I want to see what Palantir actually produces. Like does Coca-Cola get a monthly report that says "Hey, you guys should bring back Cherry Cola in Montana for an expected 4% increase in sales" or what? What does $1mil/month actually get you?
jzwinck 1 day ago 0 replies      
"Those anxieties come amid a wave of staff departures. A chart from Palantir's internal wiki said the departures through mid-April amounted to 5.8% of all staff, or an annualized rate of 20%. That compares to a departure rate of 13.6% in 2015, 12.2% in 2014, and 9.2% in 2013. Palantir paid annual bonuses in March...."

Many companies have waves of departures in the spring. Bonuses are paid (as at Palantir), holidays are over, kids are about to finish school. A 5.8% departure rate in the beginning of the year cannot be "annualized" any more than fruitcake sales from December. And if it does end up that 20% of Palantir quits in 2016, that'd be totally normal attrition for a large company. Where I used to work it was 30% among software engineers, and even this was not a problem.
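The "annualized" figure in that chart is straight linear extrapolation, which is exactly what the fruitcake comparison is poking at (assuming "mid-April" means roughly 3.5 months into the year):

```python
def annualize(rate_so_far_pct, months_elapsed):
    """Naive linear extrapolation of a partial-year rate."""
    return rate_so_far_pct * 12 / months_elapsed

# 5.8% of staff gone by mid-April, i.e. over roughly 3.5 months:
print(round(annualize(5.8, 3.5), 1))  # 19.9
```

That lands on the chart's ~20% figure, but only if departures were spread evenly through the year, which a post-bonus spring wave contradicts by definition.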

I imagine that a number of ex-employees go on to work in the industry whose data they analyzed at Palantir. This may help to explain why employees are willing to work for less than elsewhere. It's because three years later they will work for much more elsewhere. Especially the hedge fund analysts.

danso 1 day ago 0 replies      
FWIW, Gawker reporters have been making FOIA requests of various agencies about their contracts with Palantir...you can see the responses on MuckRock:


Many of those requests came back empty, here's one that produced responsive documents: the NYC Department of Finance -- $150,000 for a 6-month pilot of their "Perpetual Server Core" technology to do fraud detection:


edit: specified that the requesters appear to be Gawker reporters doing their own investigation

swingbridge 1 day ago 0 replies      
Saw the aftermath of a Palantir project after the client parted ways with them. What a train wreck. There was a lot of hype around them, but when one looked deep into what they actually did, it was a lot of smoke and mirrors and not a lot of substance. Even basic stuff like data cleaning and integration was poorly executed. Senior leadership didn't see value in what Palantir did, the project was cut off, and they were asked to leave.

Not surprised in the slightest to hear they are struggling.

bane 1 day ago 1 reply      
I don't think this is all that surprising. If you look at their staffing numbers and their fundraises, they appear to not be sustaining business but just growing staff because that's what startups do.

> Lisa Gordon, said that the majority of the company's customer relationships are multiple years in length, and many are as long as 10 years.

This is not good at all. Customer loyalty is important, but considering that they've been adding staff at a fast rate, most of their customers should be new-ish.

They've raised about $2.5b over an incredible number of rounds (multiple per year), and are currently bringing in about $420m.

Based on my understanding, they message as a software seller, but appear to make most of that revenue off of consulting and integration services tied around their software lock-in. They also message as a big-data company, but AFAIK don't provide anything that would be called "big data solutions" these days.

edit: I thought Sankar's name sounded familiar, turns out he was the guy at the very heart of Palantir's very embarrassing industrial espionage and racketeering efforts against a competitor. He was apparently punished by being promoted to company president.


tlogan 1 day ago 1 reply      
The problem with Palantir is the following:

Three letter agencies are paying and will continue to pay for Palantir consulting and products.

However, Palantir has been unable to fully productize their solution (it is still pretty much consulting). Thus they have a hard time convincing Fortune 500 companies to pay, due to the costs and the heavy dependence on human interaction.

eachro 1 day ago 1 reply      
I don't see this as particularly damning to Palantir. From what I understand, Palantir is really like a McKinsey/BCG/Bain that specializes in data rich projects. If you view Palantir as another consulting firm, I'd be curious as to how its rates and deliverables compare to the that of MBB. My uneducated guess is that what Palantir is offering for its billing rate is probably in line with the standard for consulting firms.
kwisatzh 1 day ago 1 reply      
The article hints at a larger problem that underlies all data-science-as-a-service outfits: pricing for profitability. How can you price the generation of insights so that you (Palantir, etc.) can become profitable, while those insights still save on costs for the respective clients? Coke would need to generate $18M+ per year from the insights alone to justify the costs.
l33tbro 1 day ago 2 replies      
Confusing company on many levels. On one hand, such huge clients, top talent (until recently), and Thiel's famous success prerequisite of having a unique offering. But then ... they go and do really, really weird stuff like making recruitment videos shot on someone's iPhone: https://www.youtube.com/watch?v=PhMqPoCQ5Q8

I think for a long time they got away with seeming really cloudy and potentially mismanaged because of the secrecy that is inherent to what they do. But clients are now asking tough questions about what value they can actually bring to the table for crazy dollars, and things may be coming home to roost.

zekevermillion 1 day ago 0 replies      
Spinning the reported facts another way: Palantir has doubled revenue over last year and could turn a profit at will. Meanwhile, it is able to set prices optimally at the verge of the pain point for some of the largest enterprise customers. Sounds like a well-run company.
palandick 1 day ago 1 reply      
There seems to be a lot of confusion here so let me clear up some things from my former several years at Palantir.

1) Palantir's product is a legitimate product that works quite well for many graph analysis use cases. If you can model your problem as a graph, chances are that Palantir will help you find some solid insights (if there are insights to be found) across your 15 disparate datasets. Software Engineers build the core platform.

2) Forward Deployed Engineers customize and extend the platform for specific client use cases and get client data into Palantir. If these customizations end up being useful, they get rolled into the core platform (ideally).

3) Palantir works their engineers very hard and the performance bar is very high. Calling it an engineering mill/sweatshop is a bit much, considering you get 3 meals/day, massages, chiropractic, nap rooms, excellent events, laundry, etc., and six-figure salaries or massive stock or both.

4) Lots of people have been leaving. Part of this is because lots of oldies are hitting their 10 year marks and part of it is because Palantir went through a massive growth spurt over the last 4 years or so and now the fat is being trimmed/people are leaving after concluding that it wasn't a good fit/they don't want to work that much. If you can get into Palantir, you can get into Google: Why not go to Google?

5) If you want to see the actual product, just look around. Here are like 150 examples: https://www.youtube.com/playlist?list=PLCA98B156F7EFD6A0

6) The bookings vs revenue thing is pretty deceptive, but everyone knows that the money isn't real until it's in your bank account or there's a signed contract for it to be in your account.

Lxr 1 day ago 4 replies      
> One of the things we did well early on was to recognize and invest in the unique talents of each Palantirian

I find it a bit weird how tech companies make up names for their employees now. Googler, Palantirian... does this make people work harder because they feel like they belong or something?

misuba 1 day ago 1 reply      
At least they have the decency to tell you who they work for right in their name.
pfarnsworth 1 day ago 1 reply      
I'm actually surprised that Palantir paid below market rates. I heard they paid well above market rates, which justified their stance of never going public.
insulanian 1 day ago 3 replies      
Whatever you do, please don't discontinue Plottable.js
AndrewKemendo 12 hours ago 0 replies      
I used Palantir heavily in my previous work, and it was disappointing how manual it was, given that it was sold to us as a mostly automated platform.

It had a great interface for creating network diagrams, and collaborating but it was a huge pain to get integrated.

I mean if nothing else, give us keyword matching and linking!

bkjelden 1 day ago 1 reply      
20% annualized turnover does not seem alarming for a company in the valley that hires mostly fresh college grads.

That implies an average length of tenure of 5 years. Very few college grads that I know stay at their first job for 5 years.

rfrey 1 day ago 1 reply      
$1 million/month. I think that would make even patio11 blush. :)
gaius 1 day ago 0 replies      
From a linked article:

"The primary payday for the best engineers is that you get to work with the best engineers,"

Funny how that's not the same for execs!

404throwaway 23 hours ago 0 replies      
Palantir is probably worth less than the money invested. At $420 million in revenue, they would need to be worth more than 6x revenue to match the $2.5 billion invested. They certainly have "special sauce" over and above their value as a consulting firm (which are worth maybe 2x revenue).

But how big is the market for that "special sauce"?

The intelligence customers are low-capability, have massive datasets and really do need Palantir. But they've tapped out that market. So look at the consumer brands in their customer list: low capability, yes. But you could fit the data for any of them into the RAM on one server.

Yes, Palantir has super smart guys who can find fascinating relationships in the data. But there are only so many relationships to find. And once that's done, the IT staff of the customer can do the work easily themselves.

So I'm sorry but I think Palantir is a washout (worth less than the money invested). Nice offices though.

davej 1 day ago 0 replies      
To get an idea of the kind of stuff Palantir do then it's worth taking a look at the presentation that was leaked during the HBGary leaks: http://www.businessinsider.com/palantir-wikileaks-2011-2?IR=...
ktamura 1 day ago 1 reply      
"The company, based in Palo Alto, California, is essentially a hybrid software and consulting firm, placing what it calls forward deployed engineers on-site at client offices."

This says everything about Palantir, and depending on your perspective, their valuation is (not) justified.

It's justified if you think of the equation Palantir = McKinsey + IBM. On the other hand, no consulting firm should have 20x revenue multiple.

marmaduke 1 day ago 2 replies      
How can they ask for such high prices?
jgalt212 1 day ago 0 replies      
That picture of Alex Karp is amazing.

If I ever get too rich to never ever give a f*ck, I will dress like that.


dccoolgai 1 day ago 0 replies      
Interviewed with them once: it took like 2 months and was a horrible experience. The weirdest thing was I randomly ran into this guy onsite who was a friend-of-a-friend... enough of an acquaintance that we had each other's numbers and went drinking a couple times... saw him there wearing the same Palantir t-shirt as everyone else (which was weird) and he barely acknowledged me... like I wasn't "one of them" yet... it was creepy enough that I was glad I didn't get an offer.
gzur 1 day ago 0 replies      
The headline graphic really caught my attention, because my fingers have ls -latr ingrained into them.

I thought it oddly fitting.

Analemma_ 1 day ago 2 replies      
Apparently they're not doing well. Good riddance. Palantir is creepy as hell. For all the snark about how startups and places like Apple are run like cults, like in The Circle, Palantir is the one outfit I know of where the stereotype is really true. And in the post-Snowden world, where we know for sure just how all-devouring the US intelligence community's desires are regarding our privacy, the last thing we need is Silicon Valley outfits helping them out. Assuming Karp can't right the ship again, I will not be sorry to see them go.
ginger_beer_m 1 day ago 0 replies      
> "We're looking to do transformational work with our customers," Gavin Hood, Palantir's chief of staff, said in an interview with BuzzFeed News. "Finding the right partner to do that transformational work takes a lot of care and a lot of attention." He added, "There's a lot of reasons why that doesn't always work out."

Sure, it's hard to find suckers who will give you millions of dollars for a poorly defined return.

taytus 1 day ago 1 reply      
"Although the company is not profitable"... Excuse me but, you have to be fucking kidding me!!
forrestthewoods 1 day ago 1 reply      
I know a few people who have worked for Palantir. They all agree it doesn't work. They all quit because Palantir data analysis is being used to ruin lives (throw people in jail) and they believe the whole operation to be a giant defrauding of the government.
untilHellbanned 1 day ago 0 replies      
Theranos, Zenefits, Palantir: what's the next unicorn deflation? Is it Uber or Spotify, with their Palantir-esque spend $1 / make $0.90 strategies, or is it another company with less-reported vulnerabilities?
sharkweek 1 day ago 1 reply      
Will Alden is shaping up to be one of the most connected journalists in startupland. First the Zenefits coverage, and now this, he's done a great job getting lots of great sources at some of the more interesting companies.

Credit to BuzzFeed for building up a pretty solid news team.

CydeWeys 1 day ago 4 replies      
> If you have information or tips, you can contact this reporter over an encrypted chat service such as Telegram, Signal, or WhatsApp, at 310-617-1302. You can also send an encrypted email to will.alden@buzzfeed.com, using the PGP key found here.

This is the best "contact me if you have juicy insider info" slug that I have ever seen. He can take PGP-encrypted email!

And this is on Buzzfeed!!!

leroy_masochist 1 day ago 2 replies      
This article was way, way better sourced than I thought it'd be. Especially for Buzzfeed of all places.
hackaflocka 1 day ago 0 replies      
There are a lot of similarities to Theranos.
dschiptsov 1 day ago 1 reply      
SAP-like scam. Highly refined, Oxford-educated sales and execs, selling with high-status deceptive techniques grossly overpriced, lovest quality outdated by a decade crap, for support and maintenance of which they are billing their victims for each hour spend by expensive suit wearing nonsense talking consultants.

BTW, the guys who went there for sweatshop positions are those who barely managed to graduate. But in sales there are top tier liberal arts guys, with refined speech and behavior.

These kinds of enterprises are a classic social pyramid, modeled after an organized religion, especially the Catholic Church, with exactly the same deceptions, dogmas and loyalty demanded of the lower ranks.

ZanyProgrammer 1 day ago 2 replies      
I'm curious if any positions require pre employment drug tests, random urinalysis, or security clearances. What a miserable life for a 20 something in the private sector!
30 years later, QBasic is still the best nicolasbize.com
489 points by bubblicious  3 days ago   290 comments top 73
jbandela1 2 days ago 7 replies      
For young children, QBasic and goto may indeed be superior to more structured languages. From a neurodevelopmental perspective we know that young children are very concrete and have limited abstraction ability. With Basic and goto, a program is more concrete (you run a line, which is more or less one command) and linear (no nested structures). With structured programming, it is more abstract and has a recursive structure (with arbitrary levels of nesting).

Because of this, it might be preferable to introduce a child to the more linear, concrete programming model with goto and let them experience the thrill of programming; then, as they develop and their programs increase in complexity, introduce them to more abstract methods and languages such as structured programming, OOP, and functional programming.
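The contrast being drawn might look something like this; a sketch in JavaScript rather than BASIC, and the countdown example and all names in it are mine, not the commenter's:

```javascript
// Linear, BASIC-style: one statement after another, top to bottom.
// A child can point at any line and say exactly what it does.
const lines = [];
let count = 3;
lines.push(count); // 3
count = count - 1;
lines.push(count); // 2
count = count - 1;
lines.push(count); // 1
lines.push("Liftoff!");
console.log(lines.join("\n"));

// Structured: the same behavior, but the reader now has to juggle
// a function definition, a parameter, and a nested loop body.
function countdown(from) {
  const out = [];
  for (let n = from; n >= 1; n--) out.push(n);
  out.push("Liftoff!");
  return out;
}
console.log(countdown(3).join("\n"));
```

Both halves print the same four lines; the second buys reuse at the cost of the abstraction jump the comment describes.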

jasode 2 days ago 6 replies      
>to realize that in more than 30 years, we have not been able to come up with something better for our kids: Qbasic ..., but we have never really made a simpler or more direct access to the thrill of programming than QBasic.

First, I think it's really cool that he's exposing a young mind to programming. But I'm also old enough to have installed QBasic from floppy disks and part of me thinks his statements I quoted are romanticizing QBasic a little bit.

I think that Javascript in today's browser is a fine substitute. I've had good experience with children that age by going to the browser, pressing F12, go to Console tab and start typing code. It can start as simple as:

 alert("hello world");
And boom, you get a popup. You can also show the kid he can type in arithmetic stuff like "2+3" and he'll get back the answer 5. You can then show him how to modify the existing web page. You can ramp up a slight bit of complexity by showing how to create a text file, write Javascript in it, and then have the browser run it. The 8-year-olds I've seen can handle this no problem.

What I like about the Javascript-for-kids approach is that it shows them that programming isn't some other universe where you install QBasic in a vm. Instead, the initiation into programming/experimentation is just 2 keystrokes away. There's something about the immediacy of F12-Javascript that keeps it from being an esoteric dark corner. The kid can also get more mileage out of his "programming" knowledge because he can use his Javascript console tricks at his friend's house on any web browser. On the other hand, playing with QBasic today is more isolating. The use of QBasic in the 1980s had more utility because the syntax of '10 PRINT "HELLO"' also worked on Apple II, Commodore 64, Radio Shack TRS-80, Texas Instruments TI-99, etc.
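A first F12 session along those lines might look like the following; a sketch only (the greet function is a made-up example, and the browser-only lines are commented out so the rest runs anywhere):

```javascript
// Things a kid can type straight into the F12 console.
console.log(2 + 3); // the console echoes 5

// One small step up: a reusable greeting.
function greet(name) {
  return "hello " + name + "!";
}
console.log(greet("world")); // prints "hello world!"

// In a browser (not in Node) the same console can poke the page:
// document.body.style.background = "hotpink";
// alert(greet("Noah"));
```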

ilitirit 2 days ago 2 replies      
I don't think QBasic's mainstream popularity can be overstated. Googling "gorillas.bas" gives you a predictable result. I suppose it can be argued that Gorillas and Nibbles are partly responsible for QBasic's popularity (is there a lesson in here somewhere?).

However, like many others, QBasic was not my first experience with a BASIC language. Prior to that was GW-BASIC on the PC (it was part of our introductory computer course at high school), and before that was Sinclair BASIC, which I learned in part from a book called "Peek, Poke, Byte and RAM".


But my first encounter with a programming language was Logo:


I'm sure there was a BBC Kids show that featured Logo programming (yes, they drew shapes with the turtle), but I can't remember the name. What I liked about Logo was that, as a child, it seemed very intuitive, and I felt like that even without having touched a computer I knew what programming was about, which is a lot more than I can say for any other language I've worked with.

nickpsecurity 2 days ago 4 replies      
For anyone that misses QBASIC, there's FreeBASIC that's a lot like it & easy to install:


After losing my memory, I had to relearn programming nearly from scratch. I took on a challenge to write a specific program in around an hour, from language to toolchain to solution. It had to be a systems-like language. Got bogged down by toolchains, IDEs, whatever. Said screw it: why the hell is it so hard when I faintly recall starting lightning-fast on QBASIC? Found a BASIC, FreeBASIC, that was just like it, with a simple command to launch stuff made in a text editor, and good documentation. Maybe half an hour of my challenge gone.

Next, time to apply lessons I remembered from high-assurance and Wirth. First, subset the language to minimum necessary. Used docs to test each in isolation to develop template & functions for using them, esp file I/O. Wrote formal spec in English & pseudocode of my problem with decomposed functions. Mapped almost 1-to-1 to FreeBASIC as expected. Did one refactoring. Executed code for first time to see successful run & solution to problem.

Yeah, BASIC is friggin awesome. I stretched industrial BASICs, esp 4GLs, really far back when I was a hacker, as they did rapid iteration, were safe by default, and ran fast. Today, I'd say do a Wirth language like Oberon or Component Pascal instead. Still, BASIC has lessons to teach us, and a painless bootstrapping for developers that many modern tools lack outside of scripting.

tdeck 2 days ago 0 replies      
One important thing about QBasic that's often overlooked is that it's a community of 100% amateurs.

I learned to program in 2004 in QBasic, although I very rarely fire it up anymore. By then it was not considered a "serious" language and we all knew it. But that meant the stakes were a lot lower. It's a lot of newbies and a few old-timers who do things for fun. This makes things a lot less intimidating for the non-coder. There are no conferences, no corporate sponsors, no polarizing big-shot Twitter personalities. It's a realm surprisingly insulated from the pissing contest of modern engineering culture and technical progress, where things that are acknowledged to be easy in other languages are nonetheless praised as impressive when done in QBasic.

Another thing is that QBasic has no decent library functionality for code reuse (QB4.5 has a linker but no one knows how to use it). This means people constantly reinvent the wheel, and they aren't told not to. People copy and paste snippets of code and learn how they work, but they don't build ever more complex mashups every year. Thus, you're not on the framework treadmill or starting 5 years behind the curve. Things are done simply and idiosyncratically, pretty much the same way they were done in 1995 or 1988.

Makes me nostalgic just thinking about it.

esfandia 2 days ago 1 reply      
What's nice with BASIC is that no context is required from the programmer: you don't need to know about the call stack or about variable scope. It's the programming equivalent of WYSIWYG. It also doesn't hurt that BASIC used to be resident in ROM (thinking back of the Sinclair ZX Spectrum), and that there were nice and simple primitives for drawing on the screen.

I agree with the blog author, and I still think it's the best language to learn programming. Everything else is software engineering, and gets in the way of the instant gratification that got so many of us hooked on programming.

What's wrong with starting with BASIC like I did, writing increasingly complex code, getting frustrated with the resulting spaghetti, then learning to use GOSUB more, which gets you ready for a procedural language like Pascal? At least this way you know why pros don't code in BASIC. The same path of frustration (and fun) can also be used to go from Pascal to an OO language like Python or Ruby.

sevensor 2 days ago 1 reply      
My favorite thing about this story is that Basic was meant to be used exactly this way, for getting new programmers off the ground. And like everyone, I've got fond memories. We had an IBM PS/2 80286, with a whole megabyte of memory, a 20 megabyte hard drive, and MS-DOS three point something. At some point, I discovered that it had Basic, but it didn't have a graphical editor. It had edlin, which was still enough to write programs that told their users how awesome I was. Forever, with gotos.

And for the record, I don't think using gotos as a child sabotaged my ability to write structured code later in life. "GOTO considered harmful" gets taken way out of context!

hangonhn 2 days ago 1 reply      
"When developing a skill, it is much better to acquire the right reflexes from the start rather than have to correct years of bad practice."

I disagree with this statement. "Learning" is a long process of acquiring and also discarding habits and knowledge. The best intro I ever read for relativity was "An Equation That Changed the World" and it's a series of dialogues between Newton, Einstein, and the author (a modern physicist). Newtonian physics is relatively (haha) easy to understand. Once you understand that and then you can throw in some edge cases where it fails, you can introduce relativity and it makes relativity much easier to grasp.

Likewise for programming. I learned with QBasic. Then, as the code got messier, I started using "sub", etc., and playing with other concepts until I outgrew the language. Functional programming came to me the same way: I outgrew my existing methods and reached for more advanced ones.

All this is to say it's OK to start with easier and imperfect methods. We learn by acquiring new knowledge and then knowing them well enough to see their flaws before moving onto something better.

vancan1ty 10 hours ago 0 replies      
I think BASIC! for Android by Paul Laughton is a good modern alternative for an entry to programming. It comes with good documentation and examples, and is a classic BASIC-type language. Link: https://play.google.com/store/apps/details?id=com.rfo.basic&...

Once a kid wants to move up from that to solve more practical problems on a PC, I think a combination of bash (including basic usage of rsync, crontab, grep, sed, find, wget, ssh,...) and maybe VBA/LibreOffice Basic can take you a long way. (You can easily make nice little form/button-based applications by dragging and dropping in the office suites. Also check out http://excelunusual.com/ for some inspiration on things you can do in VBA which I would not have thought practical!).

GregBuchholz 2 days ago 5 replies      
No love for Logo? I currently have my kids experimenting with ucbLogo and FMSLogo. And one of our latest endeavors is to hook it up to Minecraft [1], but that is certainly quite convoluted, with having to run minecraft servers with various plugins, and then working around the borked networking capabilities of FMSLogo, etc.. I wonder what other simple approaches there are to programmatically control Minecraft.


whatever_dude 2 days ago 0 replies      
I have so many fond memories of QBasic, and QuickBasic.

Coming from GWBasic, I knew it was not supposed to be the best language ever. I had been using Turbo Pascal, and knew the IDE to be great, and the language more structured. I was also using C, and knew it to be fast.

But QBasic was better where it mattered to me.

When I learned to tame it - to name things accordingly (variable names could have dots!), to not use line numbers or GOTOs, it was so easy and fun doing anything in it.

I could press F2 to jump to any function/method declaration I needed. I could press F1 on any method to quickly jump to its help entry, and then navigate the help in the same manner (it was all hyperlinked!).

I could stop my program in the middle of execution to issue inline commands, print vars, and even CHANGE it before continuing execution... even today, with the proper debugging/logging any modern IDE/platform provides, I've never had a dev/test workflow as easy as that quick iteration cycle QBasic allowed.

In time I started using QuickBasic instead (not the same thing!), learned to call interrupts, create actual executables, and do all kinds of crazy stuff most people were not supposed to do from there. Sure, running assembly from it wasn't as easy as in Turbo Pascal (where you could do it inline), but as long as you had a library of common assembly calls, it was just a matter of reusing it.

The language wasn't fast, and the compiled files were a bit bloated. In time I moved to something else. But I still miss the intimacy I had with the editor and the language. It's like a first love, the one that got away.

qznc 2 days ago 0 replies      
There are various good options:

Scratch https://scratch.mit.edu/

CodinGame https://www.codingame.com/games

Squeak http://squeak.org/

Logo (even in browser https://turtleacademy.com/)

JsFiddle if you want to go for somewhat longer javascript https://jsfiddle.net/

Alice http://www.alice.org/

brianpgordon 2 days ago 1 reply      
I think TI-BASIC is a strong competitor. A free-form keyboard can be overwhelming and the different things you can do aren't very discoverable. In the TI-BASIC environment on the TI-82/83/83+/84+, the keywords (Goto, While, Disp, ClrHome, etc) are tokens which must be inserted into the program by selecting them from menus. So you get a rich set of powerful commands (easily in the hundreds, even for the simple TI-84, and many more for the TI-89) but they're all discoverable through the normal menus and through the catalog menu. You start out knowing a few commands like Lbl and Goto, but through the process of scrolling over unknown commands (like maybe For or While in the control flow menu) repeatedly you start to wonder what they do, and then try them out, until you know what everything does and you can write the cleanest possible code. It's a wonderful way to learn.

And it supports both a text mode (with a "write string at grid position" operation for easy game making - text mode is a must on the CPU-starved TI-84+) and a graphical mode, with tons of stuff to do in both. For example, in high school I had a program that accepted a string of "DNA" and used it to repeatedly call the built-in pattern-drawing command with different arguments, producing a mosaic of overlapping patterns, so that each different input string produced a different final pattern. It's like 4 lines of code and kept me endlessly entertained in class.

And that brings me to its greatest advantage- its presence in the math classroom. In middle school and high school, math for a gifted student is boring as death, and you've got a little BASIC interpreter sitting right there on your desk. You can transform hundreds of wasted hours a year into productive learning time.

transfire 2 days ago 3 replies      
To be honest I don't think modern languages are all they're cracked up to be. Back in the day, very large and complex systems were managed quite well with languages like QBASIC, COBOL and FORTRAN, and ran rock solid for decades (hell, a few are still operational). But today, programming is such a polyglot mess and conceptual navel-gazing nightmare that it really is no wonder so many projects end in failure.
equalarrow 2 days ago 3 replies      
Author is spot on. My son and daughter have about 2 and 4 years to go before I'll probably introduce them to programming. When I do, it's going to be a C64 all the way. And a real one. With a real old-school monitor and a luxury I didn't have for about a year or two: a floppy drive.

I've read a lot of the start-with-JS comments. But that would be the last thing I'd do. As a child learning, you need to remove the complexities and distractions of a modern system. Windows, browsers, edit JS in one app, run it in another, use a mouse, open the JS debug console, etc. Just wayyy too much stuff for a young mind to try to focus on.

Don't get me wrong, they probably _can_ learn JS, but after decades of programming in all kinds of different languages, just because JS is the latest fashion doesn't mean I'd ever try to teach it to a kid. No way.

BASIC, on the other hand, was created _for beginners_. I learned A LOT over the years when I was using BASIC. I also had magazines (COMPUTE! & Gazette) and books that helped immensely. I was just thinking yesterday, actually, that it was awesome having the C64 Programmer's Reference Guide, which had the machine schematics and port pinouts in it!! You'll never get that for a Mac or iPhone nowadays (these sorts of things usually only show up on 'hacker' sites). This was a book that put you closer to the hardware so you could more easily understand what PEEK and POKE really did. :D

I'm _really_ looking forward to the instruction days. The C64 was so popular for a reason: it was powerful & simple. This is how I want to introduce my kids to programming. They'll figure out Ruby, JS, C and things (maybe 8-bit assembly too!) later.

I loved my C64 (still do) and I hope that will rub off on them. Can't wait.

mrbill 2 days ago 0 replies      
Don't forget Ahl's "101 BASIC Computer Games"!

I spent hours and hours making stuff written for DEC/MS-BASIC work on my TI-99/4A with its limited language implementation.


PDF: http://bitsavers.trailing-edge.com/pdf/dec/_Books/101_BASIC_...

I had the later yellow-cover edition myself.

ender672 2 days ago 0 replies      
QBasic runs flawlessly in your browser at Internet Archive - https://archive.org/details/msdos_qbasic_megapack
cesarbs 2 days ago 0 replies      
QBasic was my first exposure to programming, when I was 9 years old. Fond memories. Despite it having what today I consider terrible syntax, it made programming extremely approachable. After a quick "hello, world" intro from an IT guy at my mom's office, I spent countless hours reading the help pages and learning the commands. Sometimes I wish programming could be as simple again.
drhayes9 3 days ago 1 reply      
I learned to program on an Apple ][, where the programming environment was always a CTRL+RESET away. BASIC was great. I carefully typed long programs that I found in the back of computer magazines, broke them by changing stuff, and fiddled around endlessly. Those incantations are still in my fingers (CALL -3116 for a nice visual effect, PEEK -16384 for the current keyboard key, VTAB and HTAB to get around the screen, etc).

I volunteer at a local Coder Dojo and there's nothing to give these kids who want to fiddle around like that. Scratch is the closest I've found (with the bonus of not having to necessarily be able to read or type), but it's still cluttered with logins and passwords... and, being a GUI, it suffers when the kid's computer doesn't have a big enough screen to manage the complexity they eventually create. It's a shame. I tell kids who've grown tired of Scratch that they can "graduate" to Stencyl, a game engine that uses a similar visual programming metaphor but lets you drop down to the Haxe beneath. It also compiles natively to desktop or mobile, so that's cool too.

Python, ruby, and JS are all really close to the ideal "type and go" environments, but they're also littered with speedbumps. Installing packages and keeping the environment sane are difficult enough for some professional engineers, let alone kids who just want to mess around and make cool stuff.

The woman who started my local Coder Dojo got her son started with his own Linux computer at a really young age and he's a whiz at it, so maybe there's no real problem here and I'm just underestimating the ingenuity of these kids. They'll get it done if they really want it, I guess?

EDIT: Kinda apropos, but check this out! http://beagle.applearchives.com/the_posters/ I would've killed for these posters when I was a kid.

dkopi 2 days ago 0 replies      
QBasic was a truly incredible way to start programming. Today I'd probably recommend Python with pygame, and especially the book Inventing Computer Games with Python: https://inventwithpython.com/
guard-of-terra 2 days ago 2 replies      
I don't think that QBasic is so bad as a procedural language.

All the primitives are there, and I've never written any code with many GOTOs in it. I just did not feel the need, even when I was 10 years old.

What this article clearly lacks is a mention of graphics.

Seriously, its ease of programmable graphics is unsurpassed, even by JavaScript, which is otherwise better for teaching.

popmilo2 3 days ago 1 reply      
Nice to hear about such success with a kid learning coding! All the best!

There is a newer, multi-platform implementation called QB64: http://www.qb64.net/

educar 2 days ago 1 reply      
There is another ritual after you start with QBasic: you open gorilla.bas and nibbles.bas.

Seriously, play gorilla.bas. It's one of the best games I have played.

dunkelheit 2 days ago 0 replies      
QBasic is what got me started with programming more than 20 years ago. The thrill of writing a program that draws a rectangle, then erases it and draws it slightly to the left, and then watching the rectangle zoom across the screen hasn't been surpassed since.

If you want to teach a visual person like myself programming and the programming environment requires more than one statement to put something on the screen (including import statements), it has already lost to QBasic.
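The erase-and-redraw animation described above can be sketched even in plain text; a hypothetical JavaScript version of the idea, where each frame is just the previous one with the shape redrawn one column over:

```javascript
// Each animation frame is the previous one with the rectangle
// erased and redrawn one position further along.
function frame(pos, width) {
  // a 1-row "screen": spaces everywhere except a [] rectangle at pos
  return " ".repeat(pos) + "[]" + " ".repeat(width - pos - 2);
}

// Printing frames in a loop makes the rectangle "zoom" across.
for (let x = 0; x < 10; x++) {
  console.log(frame(x, 20)); // in QBasic: CLS, then draw again
}
```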

vmorgulis 16 hours ago 0 replies      
BASIC-256 is a good alternative to QBasic: http://www.basic256.org/index_en

The voice synthesis with the "say" command is impressive: http://doc.basic256.org/doku.php?id=en:say

Say "Hello, World" (instead of Print...)

fit2rule 2 days ago 0 replies      
My kids have a great time with PICO8:


The fact that everything you need to build a game is built-in is very key to their enjoyment - they don't need to deal with any operating system in this environment, and have (almost) everything they need to be self-contained developers. (Built-in docs would be nice..)

It's interesting, having watched the evolution of PICO8, to see the mass explosion in creativity around the scene, and I have observed a key to its success: by putting constraints on what you can do and how you can do it, while also filling in the inner space of those constraints with excellent tooling, you give people freedom, within a limit, to create amazing things that push those limits.

You can read more about PICO8 (if you're interested) in this wonderful 'zine, which reflects the culture of the scene very well:


Another big hit is LOAD81, from antirez:


This, I think, is better than QBasic, in that it has all the same functionality in the IDE, but teaches them a better language: Lua. Plus you can install it almost everywhere you can't install QBasic.

Anyway, in case any other HN'er out there is interested in doing stuff like this with their kids, give PICO8 and LOAD81 a try .. they're both, in my opinion, great ways to get young minds programming ..

jheriko 2 days ago 0 replies      

I learned to code from QBasic and its help system, because books were in short supply at the library and there was no internet for me.

As much as Xcode or VS or Qt Creator are great tools, they seriously lack in that regard, as do most modern development environments. QBasic is very easy to learn and easy to use, and trying to solve hard problems with it requires some genuine skill and ability...

mayreck 2 days ago 0 replies      
Honestly this thread put a tear in my eye... I was literally your kid 15 years ago when I first discovered QuickBasic on my own as a 10-year-old. I feel this so deeply. It was much easier to learn than anything else I've learned since.
aurelian15 1 day ago 0 replies      
I couldn't agree more. I started programming with QBasic -- basically teaching myself -- when I was seven. And it worked. I programmed little interactive stories, drew crazy stuff on the screen, wrote small games. I was happy with it. The only things I was always curious about (and no one I knew back then could help me with) were how to access the computer's mouse and the Sound Blaster. With my limited knowledge I never got that working until I switched to Delphi and Windows four years later.

One of the best things I remember about QBasic is the help system (also referred to in the article). This is where I found everything I wanted to know about the language, including examples. I believe that's one of the reasons I'm still more patient than the average programmer when it comes to studying software documentation.

cyberfart 2 days ago 0 replies      
I remember doing the same. Starting with a Qbasic book I had bought with the little money I had, I was already writing mud-like games and calculating prime numbers after only an hour. Though I wasn't lucky enough to have done it at that age. Lucky kid :)
ccozan 2 days ago 1 reply      
Ha! Who didn't hack the gorilla throwing bananas? :)

I learned to program with BASIC [0] before, on a ZX Spectrum clone; however, that GORILLA.BAS [1] was a good programming lesson. I remember even now, after 25 years, figuring out the algorithms and the parameters of the flying banana.

[0] https://en.wikipedia.org/wiki/Sinclair_BASIC
[1] http://www.jefflewis.net/archive/programming/gorilla.bas

smilekzs 2 days ago 2 replies      
I am curious: what's there to stop a kid from starting a Jupyter notebook (or even Mathematica, with all the FullSnakeCaseNames) and firing away? Surely it isn't complex, at least wrt the basics (pun intended)?
thom 2 days ago 0 replies      
It's mentioned in the article, but Microsoft have a nice BASIC environment called SmallBasic[0]. It's got a decent editor with IntelliSense, there's a community with a bunch of programs shared online[1] and it can interface with .NET code so you can extend it with whatever shenanigans your kids might require.

[0] http://smallbasic.com/
[1] http://smallbasic.com/program/?TETRIS

simonh 2 days ago 0 replies      
I started with MBasic on a Superbrain running CP/M, a very early ancestor of GWBasic and QBasic.

I wouldn't start kids on Basic now. My daughter is learning Small Basic at school, but we use Macs at home and the only way to run it is in a Silverlight container. All the variations of Basic I know of are too restricted. Don't get me wrong, Basic back in the day was a fine learning tool. The fact that you could PEEK and POKE memory and DIM arrays actually taught you low-level programming and system concepts that are abstracted away by modern languages and environments. Those early Basics were truly suspended part way between high-level languages and machine code.

I appreciate the advantages of JavaScript, but also its flaws. Learning to pop up alert messages is a cool hack, but beyond that it gets messy fast.

Scratch is fantastic. It's cross platform, can run in the browser and specifically designed to be very visual and intuitive. You can watch the code run. Brilliant.

Beyond that though a learning language needs more. It needs to be cross platform, support both procedural and object oriented forms and be capable of text, graphics, UI and web programming. For me that means Python. It helps that I've got pythonista on my iPad and phone. On the desktop there's also Jupyter notebook.

At the end of the day though, teach what you know and what you enjoy. All the technical arguments in the world can't beat passion.

carlmcqueen 3 days ago 1 reply      
I learned to program on Texas Instruments calculators in the same form as this seven-year-old's application: I made really elaborate choose-your-own-adventure games with if statements.
mosburger 2 days ago 0 replies      
GW-BASIC (QBasic's kinda sorta ancestor) was my first programming language (well, other than Logo I suppose). I still have a soft spot for it and might not be a programmer today if I hadn't first learned on such an easy-to-grok-for-11-year-old-me language.

I've toyed with writing my own GW-BASIC interpreter for yucks (because I've never written an interpreter) but never gotten around to it. Perhaps I should... I've been meaning to learn Rust, maybe that'd be a good excuse?

JustSomeNobody 2 days ago 0 replies      
Huge props to Noah for keeping an actual paper developers journal!
partisan 3 days ago 1 reply      
This made me smile. I remember trying to learn coding at 6 or 7 from my Tandy 56K Basic manual. I didn't get it but I tried all the same. It would have been cool to have someone there to help me understand.

It's a fortunate thing to have common interests with your child. I can't wait to have my daughters show even the slightest interest in coding. They are toddlers, so I have to just wait.

ErikAugust 1 day ago 0 replies      
That is awesome!

It is not surprising that Noah's desire to program comes from his desire to make games. That's where I got my start, with QBASIC ~20 years ago.

Before that, I remember wanting to take an Entenmann's Donut box and put action figures on sticks to make my own analog fighting game.

How many of you got into programming because of a desire to make games?

partycoder 2 days ago 0 replies      
As a person who taught himself programming (and a bit of English) with QBasic, and who has known many people that did the same, I agree with this article.

But there are other BASICs too, such as the one for computers like the ZX Spectrum.

Python, in my opinion, can also be used to teach programming, since many high-level constructs are optional and the syntax is simple.

edtechdev 2 days ago 3 replies      
Try to teach your kids programming this way, and you are more than likely only going to turn them off programming for many years. There is actual research on teaching kids how to code.

Personally, I would start with code.org or Scratch. If you have an ipad, there is Hopscotch. Here is a huge list of more tools to teach young kids programming: http://bit.ly/ortonacode

But if you start with basic or python, you'll be teaching them that coding is basically similar to solving boring puzzles - not of any real use, and not as fun as videogames.

Start instead with something they would find motivating or useful, not you. Making a game, or creating a useful little app/tool that solves some problem they think is important, etc.

jay_kyburz 2 days ago 0 replies      
I'm a bit late to the party on this one but if you are looking for a great way for kids to learn basic and make video games, check out http://www.blitzbasic.com/
camperman 2 days ago 0 replies      
Wonderful post. I did something similar for my younger daughter and one of her friends with Lua and Love2d. In just a couple of hours they had written a Breakout clone and also understood every line of code. Very satisfying.
jordan0day 2 days ago 1 reply      
One of my disappointments with the original Raspberry Pi was that I felt that its genesis story was from people whose early experience with programming came from Apple II's and QBASIC. It was pitched as a way to expose kids today to that same kind of low-overhead, get-in-and-start-making-things-happen experience some of us had 25 years ago.

Imagine my disappointment when the Pi's getting started experience was "Boot into a graphical window manager, open up the Python IDE, start writing Python..."

demarq 2 days ago 0 replies      
I learnt QBasic not that long ago; it was my first taste of programming. However, back then I couldn't afford a computer, so I had this QBasic book and just had to 'imagine' how my programs would run.

Fast forward 13ish years and I own my own Mac and know more programming languages than most of my workmates :) Although now I try to specialize in just three, Rust, Javascript and Python, to avoid being the 'master of none'.

agumonkey 2 days ago 0 replies      
Last month I reconnected with QBasic and was surprised by the live formatter and linter, and the pseudo module handling, with each function treated as a separate entity.

Sorry MS, I didn't understand how valuable it was.

z3t4 2 days ago 0 replies      
I think QBasic was so fun because it was easy to do graphics. Doing graphics in any other language is just insanely hard, as you have to understand university-grade math.
epicmellon 1 day ago 0 replies      
Oh man I am having a nostalgia heart attack right now. I remember coding a "guy shooting falling aliens from the bottom of the screen" game and hilariously having the aliens move not only through a timer but ALSO when you moved since there was no threading. Classic.
someone_ 2 days ago 2 replies      
I agree- we should also have just as easy of a way to start off with drawing.

Why do we need to install 50 frameworks, tell a window how to spawn, basically create a universe just to start experimenting with creating computer graphics? It's a (Width x Height) matrix of RGB values-- why can't we get a simple way to create a simple 100x100 box for kids to draw in?

No reason your son couldn't tell a screen to print colored pixels in RGB value with the right syntax.
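In that spirit, here is a minimal sketch of exactly that idea using only the Python standard library (the file name and the drawn shape are arbitrary): treat an image as that Width x Height matrix of RGB values and write it out as a plain-text PPM file that most basic image viewers can open.

```python
# A canvas really is just a Height x Width matrix of (R, G, B) values.
# PPM ("P3") is about the simplest image format there is: a short text
# header followed by one "R G B" triple per pixel.

def make_canvas(width, height, color=(255, 255, 255)):
    """Create a width x height matrix filled with one RGB color."""
    return [[color for _ in range(width)] for _ in range(height)]

def write_ppm(path, pixels):
    """Write a matrix of RGB triples to a plain-text PPM file."""
    height, width = len(pixels), len(pixels[0])
    with open(path, "w") as f:
        f.write(f"P3\n{width} {height}\n255\n")
        for row in pixels:
            f.write(" ".join(f"{r} {g} {b}" for r, g, b in row) + "\n")

canvas = make_canvas(100, 100)
for x in range(20, 80):           # a red horizontal stroke
    canvas[50][x] = (255, 0, 0)
write_ppm("canvas.ppm", canvas)
```

No frameworks, no window spawning: a kid can poke individual pixels and reopen the file to see the result.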

nxzero 2 days ago 0 replies      
The key is to get people coding; it doesn't matter what the language, platform, etc. is, in my opinion. I think I've worked in more than 20 or 30 languages to help someone new to programming write some programs. Each language has its own set of strengths and weaknesses, but normally my experience is that the person having a choice is really important to their sticking with it.
ajuc 2 days ago 1 reply      
I introduced my 10-year-old cousin to Turbo Pascal when we met a few years ago. It took less than 3 hours to download it, install it (plus the patches to run on a modern computer), make a game where you can move a circle around the screen, and teach him how he can modify it and how the basic syntax works.

There's nothing like that in modern programming. Python comes close, but is too magical.

agentgt 2 days ago 0 replies      
I think Racket comes close but is still a little more adult.

I remember going to GaTech in early 2000 and there was a teacher named Guzdial pushing Squeak like it was god's gift to children and humanity. There is nothing wrong with the language but I have yet to see kids use it or like it. I have seen high school kids use Racket though.

eric001 3 days ago 0 replies      
Wow... good job man. My first experience with programming was QBasic, and that was about 16 years ago. I only did it for a couple of weeks but when I picked up on programming several years later as a young adult, it all came back to me - assigning variables from user inputs, printing on screen, if then else, all the basics. Good stuff!
imakesnowflakes 2 days ago 0 replies      
I think kids should not be actively encouraged to learn to code early.

You can learn to code any time, when you are older. As far as I can see, there is no advantage in learning to code early.

But there are skills you need to pick up, and experiences that you can only have when you are a kid. Let us encourage kids to do those things.

rbanffy 2 days ago 0 replies      
Who'd a thought thirty years ago we'd all be sittin' here drinking Chateau de Chassilier wine?
darkstalker 2 days ago 0 replies      
I also started with BASIC. First was Commodore BASIC and then QBasic. I remember there was also LOGO, one of the original didactic languages. IMHO, if you want a simple language in the same spirit as BASIC, the closest you can find is Lua.
dudul 2 days ago 0 replies      
My post won't add much to the discussion, but QBasic was also my very first language. I was a little older than Noah when I first used it (I think I was ~12/13) and I still remember having so much fun with it :)
mark-r 2 days ago 1 reply      
So who will do an equivalent of QBasic for the web? It seems that the language is simple enough to be implemented entirely in Javascript. Then you wouldn't have to deal with the hassle of installing.
bitJericho 2 days ago 0 replies      
The Monkey X programming language is a great alternative to qbasic. It's simple to install, cross platform, procedural, OOP, and incredibly powerful the more you get to know it.
thedaemon 2 days ago 0 replies      
I remember learning to program with QBasic as well. It was so much fun as a kid. I painstakingly drew out full scenes with line and circle commands for a small text based adventure game. Good times.
artur_makly 2 days ago 0 replies      
I'm getting my toddler started with this RL code: http://www.primotoys.com/cubetto
kazinator 2 days ago 1 reply      
I tried (+ 2 2) on a four-year-old once.

Instant, complete acceptance; no complaints about parentheses or lack of infix or anything.

nhlx2 1 day ago 0 replies      
Emacs, then elisp perhaps?
mgalka 2 days ago 0 replies      
OMG, please don't teach him GOTOs!!

GOTOs are easy to abuse, but they really do come in handy sometimes.

Annatar 2 days ago 0 replies      

an acronym for Beginner's All-purpose Symbolic Instruction Code.

Yep, I'd say that checks out.

liveoneggs 2 days ago 0 replies      
Of course: Chipmunk Basic - http://www.nicholson.com/rhn/basic/
jeremywen 2 days ago 0 replies      
Sonic Pi was created for this very purpose, to teach kids to code: http://sonic-pi.net/
jedateach 2 days ago 0 replies      
:') My first code was written in QBasic
adamwong246 2 days ago 0 replies      
Man, I wish I'd kept some of my .bas files.
EugeneOZ 2 days ago 0 replies      
My first language too :)
username3 2 days ago 1 reply      


QBASIC in a browser

spriggan3 2 days ago 0 replies      
Rebol ?
thrownear 2 days ago 0 replies      
Less about Qbasic and why it is still the best and more about a parent bragging about their kid doing stuff with computers so early.

I mean, just make the title "my kid wrote a text game in QBasic" and do your bragging there. No problem.

The Startup Zeitgeist ycombinator.com
466 points by loyalelectron  3 days ago   152 comments top 36
minimaxir 3 days ago 7 replies      
While there certainly is a lot of data, and it is presented very well, the analysis falls into the same traps as analyses of Google Trends data, particularly in the keyword analysis. Causation of changes in trends is ambiguous.

For example, YC applications are optimized for the highest probability of getting into YC by definition, and so it would be expected that terms on the application would be optimized for getting venture capital, not necessarily what is happening in the marketplace today. (The "Slack: King of the Enterprise Tools" slide is a good example; I'm sure the mention of "Slack for X" is a golden flag for some VCs)

A question about the chart presentation: why are some lines straight, and some lines smooth? (e.g. in the Startup Competitors slide, Instagram, Airbnb, and Uber have curved lines but Tinder, Whatsapp, and Snapchat have straight lines)

alain94040 3 days ago 3 replies      
Two observations:

The web vs. app data seems to be missing something. According to that graph, about 15% of startups are building either an app or a website. What are the remaining 85% doing? That seems strange.

I can't emphasize enough the "crystal ball" effect of receiving all those pitches. By hearing every idea out there, you get a free crash course in the future. While founders don't like to hear it, a similar idea has already been pitched 10 times. So investors focus on how you are different from those 10 similar ideas. And often, investors will have "amazing" insights for you, just by repeating what those 10 other founders have told them already. It's like being prescient without being especially smart. A very strange feeling.

cageface 3 days ago 5 replies      
Within mobile devices, the iPad was mentioned specifically very often after it first came out. Now it's mentioned rarely - probably not because people don't build apps for iPads anymore, but instead because it's simply so obvious that you will support iPads that people don't even mention it.

I'm not so sure about this. Outside certain niches developers don't seem to have much interest in iPad apps these days. In three years of building iOS apps for clients I haven't had anyone ask me even once for an iPad version of their app.

refrigerator 3 days ago 2 replies      
I'd be interested to see whether there's any link between how an application is written (stylistically) and whether or not they get into YC. The writing itself plays quite a big part in many other "similar" processes, like applying for scholarships or research grants, to which there definitely is an "art", so I'd be curious to see whether this is the case in applying for YC as well.

Could uncover some non-obvious bad patterns that future applicants could be warned against using when applying, and some non-obvious good patterns that can help people write better applications, but it might also uncover a subconscious bias within YC towards certain patterns/styles of writing that YC might want to try to remove.

Would definitely be harder to do than this keyword analysis that Priceonomics did, but I'm pretty sure lots of people have worked on analysing styles of writing etc. over the years so it's definitely doable.

rdl 3 days ago 1 reply      
I wish this were a continually updated thing after each batch.

Some other trends I'd love to see: team sizes (mean/mode/median). Capital raised pre-YC. Revenue pre-YC. Years in existence pre-YC. Location pre/post YC.

cperciva 3 days ago 0 replies      
I'd love to see information on founders over time: % applications with 1/2/3/many founders; % founders by highest educational qualification and university; % founders by city/state/country.

And of course separate graphs for all applications vs accepted applications.

gtrubetskoy 3 days ago 0 replies      
What I think is very curious is that Apple isn't mentioned even once on this whole page. And yet it's behind much of the trends and pursuing many a technology listed on it.
noahmbarr 3 days ago 0 replies      
Caution flag: There might be a statistical significance issue on a good chunk of these graphs.

(1) Particularly when the scale isn't fixed (.001 increments to 2% increments - a 200x difference!)

(2) When making conclusions on relative trends given small percentage movements year over year.

ericjang 3 days ago 1 reply      
This is really fascinating. A quick eyeball glance suggests that Google Trends lags behind this data by 1-2 years (the correlation isn't strong, but seems to be present).


One could have used this data to long shares of public companies like Google, Facebook, Twitter, Microsoft, and short Ebay, Yahoo, and Myspace.

Crystal balls have many uses ;)

dxbydt 3 days ago 1 reply      
AI going from 0.3% to 1.8% is not a trend unless you adopt some Clintonesque redefinition of the term. Similarly for the other graphs. If such tiny percentage changes were actually trends, the stock market has millions of such trends every single day & any of the numerous trend trading systems would have minted millions of millionaires by now.
zanalyzer 3 days ago 2 replies      
'drone', 'drones'

'vehicle', 'vehicles'

are counted as separate terms. Joining them would make more sense to me.

gbog 3 days ago 1 reply      
I'm surprised that apps are still so popular; I thought Silicon Valley had already realised that users do not want to download apps upfront and manage them later - they want web apps or pages for their mobile devices. Maybe the trend is there, but it's not very explicit in these curves.
isaacn 3 days ago 3 replies      
I'm shocked that I do not see Amazon or AWS in these lists.
lowglow 3 days ago 0 replies      
This is pure propaganda. "Zeitgeist", "Crystal Ball" - these are all lagging indicators and give no authority to YC as having some predictive capacity for the world of technology.
alecco 3 days ago 0 replies      
So hype analytics? How is that useful?
r3bl 3 days ago 0 replies      
I'm surprised to see Oculus being separated from VR. I can't figure out the reason behind it (other than it being the best-known VR product to date).

Plus, I'm kind of disappointed personally to see the term "anonymous" dropping so quickly. I do fully understand the drop in blogging-related tools (not that blogging is going anywhere, just that it has become a technology where there don't seem to be any groundbreaking changes for quite a while).

vit05 3 days ago 1 reply      
I'm curious whether next year we will have a third player in the web vs. app graph.
it_learnses 3 days ago 1 reply      
I am curious why India, the only country in there, is at #30.
tnorthcutt 3 days ago 0 replies      
Somebody got a little lazy when it came time to smooth the biotech graph, huh?


Dagwoodie 3 days ago 0 replies      
FYI, the phonetic sound of a German 'z' is the English equivalent of 'ts'. I'd hate for such a novel word to be mispronounced by readers here, as it is on many Youtube videos.
asitdhal 3 days ago 1 reply      
Very few people think Microsoft is a threat. This suggests nobody wants to build an office app (like Excel or Word). We also don't really see a lot of innovation in slide-show applications (I used PowerPoint in 2006 and it's much the same now).

The world also lacks innovation in desktop apps. WinForms and C# are very popular, Java Swing seems slow or unattractive, and Qt has confusing licensing terms.

I guess someone should think about some good frameworks for desktop (like a universal look and feel in JavaScript, or some open specification).

tdaltonc 3 days ago 0 replies      
I'd love to see a contrast between apps received, apps invited to interview, and apps accepted (but I imagine that they're keeping that close to their chest).
hbhakhra 3 days ago 0 replies      
It would be very interesting to see a split of these applications vs those that were accepted and those that are still around/exited successfully.
dfischer 3 days ago 2 replies      
Wow the momentum of Slack is crazy. 850% increase closest to the next highest increase which is 211% (vehicles).
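For anyone sanity-checking figures like these, the percent-increase arithmetic is simply growth relative to the starting value; the mention counts below are made up purely so the formula reproduces the quoted 850%.

```python
def pct_increase(old, new):
    """Percentage growth from old to new, relative to old."""
    return (new - old) / old * 100

# Hypothetical example: a term going from 2 mentions to 19 mentions.
assert pct_increase(2, 19) == 850.0
```

Note that an "850% increase" means the new value is 9.5x the old one, not 8.5x.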
sskates 3 days ago 1 reply      
This is awesome! I love how the competition slide shows more startups being mentioned than big companies. I remember back in 2010 a response investors (even YC!) had to almost any idea was "won't Google do that?" Now that's not strongly on people's radar.
nxzero 3 days ago 0 replies      
It'd be more interesting to see analysis of the 1000+ startups YC has funded, not the apps.
paulsutter 3 days ago 0 replies      
This is best read as a way to stand out from the crowd and avoid overhyped and tiresome areas.
gavanwoolery 3 days ago 0 replies      
Not trying to be the least bit negative, but objectively speaking it feels less like a crystal ball and more like people reactively gravitating towards the hottest trends. Nonetheless, cool to look at the data. :)
Denoay 3 days ago 3 replies      
What? Some people in 2016 said that MySpace is their competition? Srsly?
davemel37 3 days ago 0 replies      
I would be much more interested in contrasting the applications of teams accepted and teams rejected from YC.
RA_Fisher 3 days ago 1 reply      
Wish there was a representation of uncertainty in the graphs.
uptownfunk 3 days ago 0 replies      
What did you guys use for those plots?
benhamner 3 days ago 0 replies      
Are you in a position to share a raw, granular form of the data (potentially with more anonymization)?
tassid 3 days ago 0 replies      
lots of info.
mozumder 3 days ago 3 replies      
Why do people believe VR has any potential? Aren't its design flaws obvious?

Nobody would actually want to use it.

LAMike 3 days ago 1 reply      
"building things on top of the underlying blockchain is on the rise"

You'd still be using Bitcoin if you're building things on top of the blockchain.

Elsevier Complaint Shuts Down Sci-Hub Domain Name torrentfreak.com
443 points by yunque  3 days ago   226 comments top 26
leni536 2 days ago 4 replies      
(Thanks to daveguy: https://news.ycombinator.com/item?id=11593881)

https://sci-hub.cc (same ip as above, sci-hub.io cert)

https://sci-hub.ac (same ip as above, sci-hub.io cert)

https://sci-hub.bz (uses a separate certificate and ip address)

And a tor site: scihub22266oqcxt.onion

chias 2 days ago 7 replies      
I feel a little tingle of excitement seeing my own papers on Sci-Hub. I mean, I get that they're trying to index all publications, so it's not a "stamp of approval" or anything... but it does mean that people can actually access the knowledge I tried to throw into the world, which was kind of the whole point of doing it.

I'd update my academic website to link my papers to their sci-hub URLs if I didn't think I'd catch a world of flak for it.

jfaucett 3 days ago 5 replies      
"As a result of the legal battle the site (sci-hub.io) just lost one of its latest domain names. However, the site has no intentions of backing down, and will continue its fight to keep access to scientific knowledge free and open."

Does this not enrage people? Elsevier and closed-access journals like them are doing all they can to impede human progress while leeching off taxpayer dollars to do so. Something should be done to make what Elsevier and the like do illegal; are there any groups/political parties/etc. going after them?

lake99 3 days ago 1 reply      
On the bright side, this will give some much-needed publicity to Tor Browser. Sci-hub is still available at http://scihub22266oqcxt.onion/
thesimon 3 days ago 0 replies      
Why exactly was a New York court able to issue an injunction for a ".io" domain?

It is a British Territory extension being managed by "Internet Computer Bureau Ltd" based in the United Kingdom.

zo1 3 days ago 2 replies      
Is there a donate location of sorts that we can pay into to support the efforts that the sci-hub person/team/organization is doing?

This is a another good cause that I would find worthy to donate to.

Edit. Ok, found it:


Apparently, only Bitcoin donations are possible at the moment. Bitcoin wallet for donations: 1K4t2vSBSS2xFjZ6PofYnbgZewjeqbG1TM

chrismonsanto 2 days ago 3 replies      
> Meanwhile, academic pirates continue to flood to Sci-Hub, domain seizure or not.

I realize readers of torrentfreak.com have a different relationship with the word "pirate" but--

This is not piracy. You are entitled to read work from Sci-Hub, as your taxes funded the researchers who created the work. We academics want you to read our work, we do not benefit in any way from publisher paywalls. We continue to publish in these venues because it is necessary for career advancement (whether for us, or for our students).

Please continue to "pirate" our work, and please spread the word about this problem.

atemerev 3 days ago 9 replies      
While I support open science and Sci-Hub is great, I can also see the publishers' side.

Imagine the same site with pirated high quality scientific books. (There are some, but in darker corners of the Internet). Would open access to these books be advantageous for humanity? Most definitely. Will publishers get mad about it? Even more definitely. Should publisher's work be free? I think not.

However, you can't get both sides of the coin at once. Either authors are paying publishers to get their papers peer reviewed and published (they do now), or publishers may collect payments from readers and libraries. Not both ways.

cessor 2 days ago 1 reply      
"We wish to set back society, because we want to make more money".

I use Sci-Hub all the time. I can access all papers via my library subscriptions so I actually DO have paid access to everything, but searching with and authenticating against my university library's services is slow and tedious and not feasible if I want to download many papers quickly.

grownseed 2 days ago 1 reply      
Recently shared this here: http://www.thebookseller.com/news/elsevier-defends-its-value... (https://news.ycombinator.com/item?id=11614926) which I'm guessing is the precursor to this debacle.

Reading what Elsevier have to say, you'd think they're either completely delusional (I doubt so), or know full well that they're rapidly becoming irrelevant/abhorred and are adopting a bully position, all the while pretending to be innocent and blameless victims of this situation.

Don't count on me to shed a tear when Elsevier go under.

nyolfen 3 days ago 1 reply      
.cc still works for those who need it
chinathrow 3 days ago 1 reply      
I would love to learn why Elsevier reports a profit of _only_ 37%.

I've read here on HN and elsewhere multiple times, that most of the work done is outsourced to third party labor, which is done essentially for free:

- writing papers

- organising reviews

- reviewing papers

wyager 2 days ago 1 reply      
Domain names should not, in principle, be subject to the arbitrary edicts of governments. I hope that, in the future, we switch to a decentralised and cryptographically incorruptible system a la Namecoin.
mnl 2 days ago 0 replies      
Yet again, please don't mention other sites here or on Reddit etc. Being in the spotlight is not good for them. What they are doing is morally fine in my book, but it infringes the law, and you are contributing to their demise. Keep it low profile; the more you blab about them, the sooner they are gone. And until the next platform makes it (if it does; you can't guarantee it will), it is going to cause a lot of damage to many careers out there, because there are many people who have no choice but this one. If this is a war, it is a war of attrition; it can't be won by pointing out the targets.
chris_wot 2 days ago 2 replies      
I wonder if there is a way of distributing a signed file that contains the IP address of sci-hub over a distributed medium?

Someone would just need to write a small script that copies it into the system's hosts file - but make it generic enough that it works for anything. That would help non-technical users.

This would stop courts from going after centralised name servers.

There really needs to be a way of decentralising name servers.
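The "small script" half of this idea can be sketched quickly. The sketch below assumes the mapping file's signature has already been verified out of band (e.g. with a tool like GPG); the marker comments keep repeated runs idempotent, and all names and addresses here are hypothetical.

```python
# Merge a verified {domain: ip} mapping into hosts-file text.
# The managed entries live between marker comments so the script can
# replace its own previous output instead of appending duplicates.
BEGIN = "# BEGIN distributed-mappings"
END = "# END distributed-mappings"

def apply_mappings(hosts_text, mappings):
    """Return hosts_text with the managed block replaced by `mappings`."""
    lines = hosts_text.splitlines()
    # Drop any previously managed block.
    if BEGIN in lines and END in lines:
        start, stop = lines.index(BEGIN), lines.index(END)
        lines = lines[:start] + lines[stop + 1:]
    block = [BEGIN]
    block += [f"{ip}\t{domain}" for domain, ip in sorted(mappings.items())]
    block += [END]
    return "\n".join(lines + block) + "\n"
```

In practice the result would be written back to /etc/hosts with appropriate privileges, e.g. apply_mappings(open("/etc/hosts").read(), {"example.org": "93.184.216.34"}).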

jheriko 2 days ago 0 replies      
As much as I completely support freedom of information there is a problem with not respecting the law that seems to be growing...

I might disagree with the law. I might break it intentionally, but I do my due diligence and do it knowingly.

When I get caught, I hold up my hands and face the consequences, because I entered into it knowingly. I have actually done this IRL. This is why I have no respect for this, or for Uber, Aaron Swartz, or this entire ridiculous movement of subversion instead of facing up to the challenge and taking it extremely seriously and tackling it up front and openly with some pride and courage.

I'm glad people are fighting to keep this alive anyway. That imo is part of the right approach: to defeat the law by showing its futility in the face of reality when it is very far into the wrong. But ffs, don't hide from what you have done like some kind of child. Face up to your responsibility and take it on the chin... and let people be angry about that, because that is an actual, serious wrong.

"Words are wind."

x5n1 2 days ago 0 replies      
Long live Aaron Swartz
thaw13579 2 days ago 1 reply      
Maybe this is an opportune time to bring up a related concern. It's crucial that any system for disseminating scientific work be reliable and persist indefinitely. I don't like the current system, but I wonder how this will be achieved otherwise.
alwaysdownvoted 2 days ago 0 replies      

 echo "127.0.0.1 elsevier.com" >> /etc/hosts

return0 2 days ago 0 replies      
A trivial but handy bookmarklet link for Sci-Hub:


joelthelion 2 days ago 2 replies      
Are people working on a better system to replace DNS? This little game is quickly becoming tiring.
gherkin0 2 days ago 1 reply      
How big is the sci-hub database?

Are there full/partial mirrors published anywhere?

plg 2 days ago 0 replies      
why not set it up as a tor hidden service?
cloudjacker 2 days ago 0 replies      
yeah but where's the torrent?
michakirschbaum 2 days ago 0 replies      
Looking forward to watching the Streisand Effect play out.
blue_dinner 2 days ago 1 reply      
Just like with music and file sharing, this will not hurt the large corporations. It will only hurt the researchers that depend on these papers for funding and make it more difficult for them to make a living in the future.

It will also push companies to keep research more private and proprietary. Why would I spend millions of dollars on research, only to have it freely distributed to everyone, including my potential competitors?

I've never witnessed a time where more people fight to give up more and more of their own power and hand it over to governments and large corporations on a silver platter... and then complain when it's all gone.

Bitcoin 'creator' backs out of Satoshi coin move 'proof' bbc.co.uk
396 points by blacktulip  3 days ago   279 comments top 50
Geekette 3 days ago 4 replies      
Given his history as a prolific liar, I find his post to be utter bullshit. Note how he's still lying about things he was caught at: "When the rumours began, my qualifications and character were attacked. When those allegations were proven false". NO - it was established (with confirmation from the schools in question) that he lied about having a PhD from Charles Sturt University, and he definitely does not possess 8 master's degrees. Not to mention other lies about having supercomputers and a partnership with SGI to build more, backed by a fake reference letter (all confirmed false by the company), etc.

Now, because he knows he can't successfully claim Satoshi's identity, and in light of possible charges based on an ongoing police investigation (fraudulent use of tax credits), he wants to dramatically disappear. I hope the authorities have his passport(s). His thirst for fame is unreal.

spdustin 3 days ago 10 replies      
Okay seriously, is anyone contacting the authorities to see if this man is still alive? I do not know who to contact or what his location is. His "blog" post sounds like a suicide note - I've had the misfortune of reading others, and there are telltale signs - and it doesn't appear that anyone is taking steps to see if he's okay.

Maybe he is a liar, maybe he's Satoshi, who gives a shit at this point when the jaded comments make it sound like we've been invaded by 4chan.

Grazester 3 days ago 4 replies      
"But, as the events of this week unfolded and I prepared to publish the proof of access to the earliest keys, I broke. I do not have the courage. I cannot."

This guy is so full of crap it's amusing.

gtrubetskoy 3 days ago 6 replies      
My (non-provable) theory is that Satoshi is not a person but a team, now defunct, that was tasked with developing a crypto-currency. They are probably more scientists/mathematicians/economists than they are software developers, which would explain the very sound but not exactly elegant initial code.

It would also explain the silence - the project is over, the team is probably bound to secrecy because this stuff is classified.

The only time Satoshi spoke out (edit: and we don't even know that it was authentic, see discussion below) was to help the poor bloke who was mistaken for Satoshi and whose life was being ruined, and that was probably a collective decision by the former team members; it was the right thing to do.

This is a different case - Satoshi speaking out wouldn't help in this case, Craig sounds like someone in need of help from a professional psychiatrist or psychologist, http://www.drcraigwright.net/ almost sounds like a suicide note. I feel sorry and worried for the bloke.

celticninja 3 days ago 1 reply      
Dude's a liar. After everything else he has done this is the simplest step which requires no extra burden on him. He has already claimed to be SN, the only reason not to follow through is because he cannot and could never have. Therefore he is just another pretender to the throne.
spilk 3 days ago 0 replies      
Why is there even an argument? It is not rocket science to produce a signed message with a key, the reference software can do this trivially. Until that is provided I don't see why anyone should entertain the possibility. Anything else is just theatrics.
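Indeed, Bitcoin Core exposes this as the `signmessage`/`verifymessage` RPCs. The underlying principle, proving possession of a private key without revealing it, can be sketched with a toy Schnorr signature; note this is not Bitcoin's actual scheme (ECDSA over secp256k1), and the group parameters here are deliberately tiny and insecure, for illustration only.

```python
import hashlib
import secrets

# Toy Schnorr signature over a small prime-order subgroup.
# INSECURE demo parameters: p = 2q + 1 with q prime; g = 4 generates
# the order-q subgroup of quadratic residues mod p.
p, q, g = 2039, 1019, 4

def challenge(r, msg):
    """Hash the commitment r and the message down to a value in [0, q)."""
    digest = hashlib.sha256(f"{r}|{msg}".encode()).hexdigest()
    return int(digest, 16) % q

def keygen():
    x = secrets.randbelow(q - 1) + 1      # private key
    return x, pow(g, x, p)                # (private, public)

def sign(x, msg):
    k = secrets.randbelow(q - 1) + 1      # fresh one-time nonce
    r = pow(g, k, p)
    e = challenge(r, msg)
    return e, (k + x * e) % q

def verify(y, msg, sig):
    e, s = sig
    # g^s * y^(-e) = g^(k + x*e) * g^(-x*e) = g^k = r
    # (pow with a negative exponent and a modulus needs Python 3.8+)
    r = (pow(g, s, p) * pow(y, -e, p)) % p
    return challenge(r, msg) == e
```

Signing an agreed challenge message with the genesis-era key and letting anyone run the analogue of verify against the known public key is exactly the cheap, decisive proof the comment above is asking for.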
tazjin 3 days ago 1 reply      
Can we just stop giving this con man more attention?
tommynicholas 3 days ago 1 reply      
To those of you who have never known a conman or someone with borderline personality disorder, this behavior might seem odd. To me, it's like looking at a picture of a dog and going "oh that's a dog!"
threatofrain 3 days ago 0 replies      
It's hard for me to believe that he's backing out of his promise to provide stronger evidence because the pressure is too much, because the pressure of being known as a fraud is a lot worse than being known as an inventor.
andremendes 3 days ago 3 replies      
Source: http://www.drcraigwright.net/

Asides the conversation, the whole source code of this website is: <img src="homepage.jpg">

This man really doesn't care.

EDIT: site is updated now, with proper html+css coding.

mabbo 3 days ago 1 reply      
I hope the actual Satoshi is alive somewhere, and just for fun moves a different original coin, as if to say "I am alive, and Wright is not me".
AKifer 3 days ago 2 replies      
The only blameworthy thing in this story is that the so-called mainstream media are not technically knowledgeable enough to "scientifically" check their news sources. Thankfully, all the nerds around exposed the scheme in such a short time. I like this era!
kazinator 2 days ago 0 replies      
The Nick Szabo hypothesis is plausible, see:


>In December 2013, a blogger named Skye Grey linked Nick Szabo to the bitcoin's whitepaper using a stylometric analysis.

All that, and the NS/NS: Nakamoto Satoshi / Nick Szabo. :)

I assign a low probability to the proposition that Nakamoto is other than Szabo.

Suppose Nakamoto isn't Szabo. Why the various similarities? They could be deliberate: Nakamoto isn't Szabo but another researcher who not only runs with the same ideas, but mimics the elements of writing, duplicates the timezone of activity and so on. He even chooses the letters N and S for his pseudonym to tantalize people with the hypothesis that he is Szabo. The problem with this hypothesized scenario is that Szabo would almost certainly have cried foul: "Hey, world, this Bitcoin Nakamoto dude is ripping off my research without crediting me at all!" Secondly, why would someone who wants to create a digital currency system based on Szabo's ideas go to the trouble of creating all these irrelevant similarities?

On the other hand, there is the why: why wouldn't someone who obviously knows a lot about security and privacy issues put more effort into building plausible deniability? Maybe Szabo simply doesn't care about having anything near an air-tight claim that he isn't Nakamoto, and so just let himself be sloppy. Perhaps he actually wants there to be all that circumstantial evidence, and is biding his time until the right moment to admit that he is Nakamoto, at which time, with just some small piece of proof, it will be iron-clad to everyone.
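For context on the stylometry mentioned above: such analyses typically compare the frequencies of function words, which authors use largely unconsciously. A minimal sketch of the idea follows; the word list, texts, and similarity measure are illustrative, not Grey's actual methodology.

```python
import math
from collections import Counter

# A handful of English function words; real studies use hundreds of
# features (word lengths, punctuation habits, n-grams, ...).
FUNCTION_WORDS = ["the", "of", "and", "to", "in", "a", "is", "that", "it", "as"]

def profile(text):
    """Count how often each function word appears in the text."""
    counts = Counter(text.lower().split())
    return [counts[w] for w in FUNCTION_WORDS]

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def similarity(text_a, text_b):
    """Cosine similarity of the two texts' function-word profiles."""
    return cosine(profile(text_a), profile(text_b))
```

Two texts by the same author should score consistently closer than texts by different authors, though on samples this small the result is only suggestive, which is one reason such attributions stay contested.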

numair 3 days ago 3 replies      
It is interesting to note that the only Satoshi-existence scenario whose probability improves with each passing day, is the scenario in which Satoshi / the "Satoshis" is/are dead (unless you think Bitcoin was the first great invention of hyper-advanced AI, of course).

Figuring out whether anyone actually knows who Satoshi is/was becomes more interesting than waiting for the person(s) to show up, because it will never happen in a death scenario. One thing we have gained from this bizarre circus is a sign that neither Gavin nor Jon know who Satoshi might be; who's left? Who might know?

dpweb 3 days ago 1 reply      
This whole "controversy" is incredibly boring. The real Satoshi would be able to prove it easily. Prove it, or STFU, and we can move on from the who-is-Satoshi mystery.
atomical 3 days ago 1 reply      
He played the media like a fiddle. They should follow up with some investigative reporting on him.
jgrahamc 3 days ago 3 replies      
The note is terribly sad. I hope he has people around him to support him.
saalweachter 2 days ago 3 replies      
Thought experiment: Suppose Satoshi, whoever it is, is still alive somewhere in the wild. But he doesn't have his original keys (hard-drive failure with no backup, apartment fire with no offsite backup, etc etc).

How would Satoshi prove himself then?

tunesmith 2 days ago 0 replies      
For those of you who haven't had firsthand experience with truly weird habitual liars, this is exactly what it's like... not saying Wright is a liar, but he hasn't differentiated himself from one.
neuropie 2 days ago 2 replies      
It's very interesting that he asked the media to send coins to one of Satoshi's Bitcoin addresses, and that he would return them. All of Satoshi's original coins are Pay-to-Public-Key, whereas the BBC's money was sent Pay-to-Public-Key-Hash (P2PKH). This means that Craig would have to present a signature along with a public key that just has the same hash as the Satoshi public key, not a signature for Satoshi's public key itself. I think he is banking on finding another key with the same hash, as a final desperate attempt to forge his identity.
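
The P2PK vs. P2PKH distinction is easy to see in the two locking-script templates. Below is a minimal sketch in Python; the key is made up, and plain truncated SHA-256 stands in for Bitcoin's real HASH160 (RIPEMD-160 of SHA-256) so the snippet has no extra dependencies:

```python
import hashlib

def fake_hash160(pubkey: bytes) -> bytes:
    # Stand-in for Bitcoin's HASH160 (RIPEMD160(SHA256(x))); truncated
    # SHA-256 keeps this illustration dependency-free.
    return hashlib.sha256(pubkey).digest()[:20]

pubkey = bytes.fromhex("02" + "ab" * 32)  # hypothetical compressed public key

# P2PK: the locking script contains the full public key itself.
p2pk_script = f"{pubkey.hex()} OP_CHECKSIG"

# P2PKH: the locking script commits only to a 20-byte hash; the spender
# must later reveal a public key matching that hash, plus a signature.
p2pkh_script = f"OP_DUP OP_HASH160 {fake_hash160(pubkey).hex()} OP_EQUALVERIFY OP_CHECKSIG"

print(pubkey.hex() in p2pk_script)   # True: the key itself is on-chain
print(pubkey.hex() in p2pkh_script)  # False: only its hash appears
```

So to spend a P2PK output, as in Satoshi's early coinbase transactions, a signature must verify against that exact public key; there is no hash indirection to exploit.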
criddell 3 days ago 0 replies      
His note doesn't make any sense to me.

If I were to give him the benefit of the doubt and assume he is Satoshi, this note still doesn't make any sense. Removing all doubt that he at least controls the keys is almost trivial.

koktang 3 days ago 1 reply      
Is the website image steganographic?
abalone 3 days ago 0 replies      
What is the best theory of how Gavin Andresen and Jon Matonis were deceived?
jbmorgado 3 days ago 2 replies      
I think it follows the typical modus operandi of large part of the Bitcoin community:

1 - Propose flawed argument.

2 - Insert pink unicorns.

3 - When asked for proof, don't give it and instead claim this is exactly what's best for the world but that we are too short-sighted to understand it.

bpchaps 3 days ago 0 replies      
Odd note - his page just turned from a jpeg to use actual text. So, uh. yay, I suppose.
glaberficken 3 days ago 0 replies      
Link to archived version of the full May 3rd post: https://web.archive.org/web/20160504045648/http://www.drcrai...
Tenoke 3 days ago 1 reply      
Can someone please explain how Gavin can still think he is Satoshi? I am very confused.
oneloop 3 days ago 0 replies      
The words "absolute moron" come to mind.
PhasmaFelis 2 days ago 0 replies      
The entire "who is Satoshi" thing is basically the tech equivalent of gossip rags.
satysin 2 days ago 0 replies      
The guy sounds like a pathological liar and narcissist. Amazing so many were sucked in to his bullshit. The journalists involved should be embarrassed as they look like utter fools.
crystalmeph 3 days ago 3 replies      
Tin-foil, but if I were Satoshi and wanted to remain anonymous, and some journalists started snooping around, one way to shut them down would be to publicly post a "proof" that seems technical enough to get the reporters to run out screaming "we got him!" but falls apart once more technical people start investigating...
DonGateley 1 day ago 0 replies      
Perhaps he was simply clued to some seriously bad consequence of proving his ownership of the Satoshi hoard.
ajonit 2 days ago 0 replies      
I really wish Satoshi comes out from hiding before he (or the group) is dead (assuming he is not already); otherwise it will forever take the #1 position on those listicle sites ("The top 10 mysteries of all time; You really want to know #1").
bitmapbrother 3 days ago 0 replies      
I suggested this from the start. A simple transaction from Satoshi's account to an agreed upon 3rd party is all that would have been needed to prove his claims. Claiming to not have the "courage" to perform this simple verification is just ridiculous.
lazzlazzlazz 2 days ago 0 replies      
Nobody here should be surprised; we knew Wright has been engaging in pathological behaviors for years now. The question is how Gavin will handle the damage to his reputation. I feel bad for him.
jheriko 2 days ago 0 replies      
Unsurprising. Being vague and mysterious is the hallmark of bullshit.

The fact that some people believe it is more a testament to their naivete than anything else. I pity them... :(

throw7 2 days ago 0 replies      
"There are very credible people besides Gavin and Jon who still think he is Satoshi - people who are privy to other information and whose judgement I respect."

Oh? Pray tell.

supercoder 3 days ago 2 replies      
It's a suicide note.
biot 2 days ago 0 replies      
In the video he claims that he is the one known by the "monkier" [sic] Satoshi Nakamoto. A very appropriate misspeak of the word "moniker".
fapjacks 2 days ago 0 replies      
Wow, I'm psychic! Turns out all the guy really had was a rubber suit in a freezer. It wasn't a bigfoot after all!!


cogentleman 2 days ago 0 replies      
Haha can't handle the limelight, this fella reminds me of the Flappy Bird creator.

I think he's BSing as well.

ParadisoShlee 3 days ago 0 replies      
that's the kind of apology I would expect from the Zodiac killer..
The_knight 2 days ago 1 reply      
I'd be willing to bet that the coins will move soon.
bitmadness 2 days ago 0 replies      
What an ass
jsprogrammer 2 days ago 3 replies      
Humans are not monkeys and never were. It is believed that the two species shared a common ancestor at some point.

Edit: If someone has evidence otherwise, please post.

imsatoshi 2 days ago 0 replies      
This guy is not the creator. Nobody is. Evolution!!
jonah 2 days ago 0 replies      
Bitcoin needs a "For Entertainment Purposes Only" disclaimer on it.
johnjac 3 days ago 0 replies      
We're an anarcho-syndicalist commune. We take it in turns to act as a sort of 'Satoshi Nakamoto' for the week. But all the decisions of that Satoshi have to be ratified at a special biweekly meeting.
id122015 3 days ago 1 reply      
these stories about bitcoin are like those long dramas/soap operas/Twin Peaks that I don't know anything about because I didn't have time to put up with them. Only women..
ebbv 3 days ago 6 replies      
One thing that gives me a little pause and makes me think maybe he is telling the truth (even though all other evidence and logic says he's a fraud) is that the real Satoshi has not put out a statement calling this guy out as a fraud.

When the Dorian Nakamoto stuff happened the real Satoshi put out a statement saying he's not Dorian. That wasn't that long ago.

If he did that you'd think he'd step up and say he's not Craig Wright either. But maybe he did the previous one because he felt Dorian was being victimized whereas Craig Wright has made a mess of his own design.

It's hard to say for sure.

Japan Now Has More Electric Car Charging Spots Than Gas Stations transportevolved.com
357 points by prostoalex  3 days ago   113 comments top 16
kamran20 3 days ago 3 replies      
"Unlike the majority of gas stations in Japan however, the 40,000 electric car charging points quoted by Nissan includes ones in private homes, causing some critics to cry foul. After all, if a charging station is hidden in a privately-owned garage, it isn't easily accessible to the public. Yet while we understand that criticism (and it's why we used an asterisk in our headline) the rise of charger-sharing sites like PlugShare.com means that more people than ever before are offering their private charging station for others to use, either as an altruistic gesture or for cold, hard cash. Moreover, it's possible to argue that because privately-owned charging stations are enabling owners to drive their cars without visiting public charging stations, they're providing just as valid a service to everyday drivers as publicly-accessible, higher-powered ones. But while electric car charging stations may now be far more common in Japan than a gas station, the numbers of electric cars on the roads of Japan still represent a tiny proportion of the total cars registered."

It includes public and private charging spots, but the article does raise some good points about 'charge spot pooling' and reducing the dependency on petrol stations. Interesting times ahead!
Animats 3 days ago 0 replies      
Here's the interactive map.[1] Most of those charging stations are not public. Hotels and auto dealerships have some of them. I've been looking for them in Google Maps. Here's one that's in a public parking lot and has signage.[2] You have to pay for parking to charge, but that's not unreasonable in Tokyo.

[1] http://www.chademo.com/wp/jpmap/
[2] https://goo.gl/maps/M5WYLuM4uEK2

anexprogrammer 3 days ago 2 replies      
This is hardly surprising. Or terribly newsworthy. Just reinforces how early we still are in the adoption of electric vehicles.

The number of gas stations has plummeted in most places over the last 20 years. I presume Japan is similar. It's been equally down to supermarkets getting into petrol supply and the oil companies introducing large weekly minimums. The UK probably has 1/2 or 1/3 the number of petrol stations compared to y2k. It's ruined the convenience of filling up, and tiny 1- and 2-pump stations are consigned to history. Live somewhere rural? Now you can drive 20 miles just to fill up.

Meanwhile charging points are being put in everywhere - towns are making charging bays in car parks, or on street. Much like happened with petrol in the 1920s.

Oh, and let's not forget a charging point caters for a lot fewer vehicles than a petrol station.

duaneb 3 days ago 3 replies      
It is a strange experience to be jealous of countries with high population density. However, every time bicycles, electric vehicles, or renewable energy comes up, I feel profoundly embarrassed the US can't form a similar effort given our resources, even if I rationally understand it's probably not worth it to have a high speed train hit all the small towns in Montana, or how solar panels might not work as well in Forks, Washington as they do in California.
wiz21 3 days ago 1 reply      
Interestingly, Japan's oil consumption fell by roughly 18% since 2000 while the population remained stable.



mikeash 3 days ago 0 replies      
As I write this comment, this story is on the front page just above the story titled "Misperceiving Bullshit as Profound." I find the juxtaposition amusing. I'm a huge proponent of EVs, but this comparison is meaningless and silly.
pnewman3 2 days ago 1 reply      
I just came back from Japan, and saw very few EVs on the road, just one or two Nissan Leafs. By a wide margin the cars you mostly see on the road there are kei-cars, which are small cars with 660cc engines. They are cheap, tax-advantaged, offer good fuel economy, and are surprisingly roomy. Until there is a good EV kei alternative I don't think EVs will really take off in Japan. Hybrids are pretty common though.

I also saw several mobile charging spots at convenience stores though, which is brilliant, and some larger car parks covered with solar panels. Being able to get charging points at convenient locations using the existing grid, and thus doing away with specific trips to the gas station and fuel distribution logistics will be a huge advantage eventually.

im3w1l 3 days ago 1 reply      
More than one pump per gas station. Still very impressive!
adventured 3 days ago 4 replies      
This claim is only true because they included private garage based charging, such as in a home. A pretty big stretch in trying to generate a headline.
frgewut 3 days ago 1 reply      
I would be more interested in statistics about Norway as 30% of new cars are electric there.
dbalan 3 days ago 1 reply      
The actual data worth looking at will be the percentage of electric vehicles relative to ones that run on conventional fuel.

[1] https://en.wikipedia.org/wiki/Plug-in_electric_vehicles_in_J...
[2] https://en.wikipedia.org/wiki/Electric_car_use_by_country#Ja...

datsun 3 days ago 0 replies      
Why don't gas stations offer quick charging spots by now I wonder? Why are they not interested in attracting EV owners?
ck2 3 days ago 0 replies      
Sometimes revolutions happen quietly, very quietly.

(actually, that could be an electric car company slogan)

piyushpr134 3 days ago 0 replies      
Japan has quietly done it and forgot to tell the world!
akafred 3 days ago 3 replies      
Cool, lots of cars running on nuclear energy, then!
kubatyszko 3 days ago 1 reply      
and Tesla has barely THREE charging stations in Tokyo... (2 are in "center" and one on the periphery).
Why Wind Turbines Have Three Blades cringely.com
414 points by rfreytag  2 days ago   182 comments top 31
Cerium 2 days ago 10 replies      
Three blades is the smallest number that reduces the vibrations due to the blades crossing the support structure. When a blade crosses the support, its applied force is reduced because the wind is slower around the support. This reduction in force creates a yawing torque that can lead to unwanted vibrations. Much of the required structural stiffness and bearing capacity is related to these effects. Three blades minimize the effect because when one blade crosses the support, the other two blades are out in a Y shape, reducing the force differential compared to two blades.
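
The tower-shadow argument can be illustrated with a toy model: give each blade a thrust dip as it passes the support and compare the ripple in total rotor thrust per revolution for different blade counts. The Gaussian dip shape, depth, and width below are invented for illustration, not real aerodynamics:

```python
import math

def tower_shadow(theta, depth=0.2, width=1.0):
    """Per-blade thrust factor with a dip as the blade passes the
    tower (at theta = pi). A toy Gaussian model of the shadow."""
    d = math.atan2(math.sin(theta - math.pi), math.cos(theta - math.pi))
    return 1.0 - depth * math.exp(-(d / width) ** 2)

def thrust_ripple(n_blades, steps=1000):
    """Peak-to-peak variation of total rotor thrust over one revolution."""
    totals = []
    for i in range(steps):
        theta = 2 * math.pi * i / steps
        totals.append(sum(tower_shadow(theta + 2 * math.pi * k / n_blades)
                          for k in range(n_blades)))
    return max(totals) - min(totals)

for n in (1, 2, 3, 4):
    print(n, round(thrust_ripple(n), 3))
```

With these assumed numbers the ripple drops sharply from two blades to three, because the other two blades sit in a Y well clear of the shadow when one blade passes the tower.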
scblock 1 day ago 4 replies      
This is only partially related to my other long post here, but it should be mentioned. This article appears to be largely about potential to disrupt the wind industry with outside the box thinking. But in the more than a decade I have been working in this industry the single most disruptive change in project performance was targeting low wind speeds rather than high, and accepting that turbines will have to shut down in higher winds.

The main way this is achieved is to take a huge rotor and put it on a small generator. More energy is lost at high winds, but the ramp up from no generation to maximum is much faster. Since most projects spend the majority of their time along that ramp rather than at rated power this results in large gains in annual energy.

This means that some sites once considered marginal end up being good to very good. One area I've been working in saw a more than 25% increase in predicted energy for the same capital cost with the new turbine types.
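
The big-rotor-small-generator point can be sketched numerically: with the same toy cubic power curve, a larger rotor on the same generator reaches rated power at a lower wind speed, so it spends more of the year at full output. All numbers here (rotor diameters, Cp, the Rayleigh mean wind speed) are illustrative assumptions, not data for any real turbine:

```python
import math

RHO, CP = 1.225, 0.45  # air density (kg/m^3) and assumed rotor power coefficient

def power_kw(v, rotor_d, rated_kw, cut_in=3.0, cut_out=25.0):
    """Toy power curve: cubic ramp up to rated power, zero outside cut-in/cut-out."""
    if v < cut_in or v > cut_out:
        return 0.0
    area = math.pi * (rotor_d / 2) ** 2
    return min(0.5 * RHO * CP * area * v ** 3 / 1000.0, rated_kw)

def capacity_factor(rotor_d, rated_kw, mean_wind=6.5, steps=2000):
    """Expected output / rated output under a Rayleigh wind-speed distribution."""
    sigma = mean_wind * math.sqrt(2 / math.pi)  # Rayleigh scale from the mean
    total, dv = 0.0, 30.0 / steps
    for i in range(steps):
        v = (i + 0.5) * dv
        pdf = (v / sigma ** 2) * math.exp(-v ** 2 / (2 * sigma ** 2))
        total += power_kw(v, rotor_d, rated_kw) * pdf * dv
    return total / rated_kw

# Same 2 MW generator, two rotor sizes, at a modest 6.5 m/s site:
print(round(capacity_factor(90, 2000), 3), round(capacity_factor(120, 2000), 3))
```

The low-specific-power machine (the 120 m rotor) shows a clearly higher capacity factor, which is the gain described above.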

mapt 2 days ago 7 replies      
* I don't understand his explanation of starting torque. You can jump-start that problem by other means in the alternator. Torque and power should be decoupled anyway by blade pitch.

* There is such a huge difference between a 12m blade and a 60m blade that I don't see how the comparison is at all relevant. We played with smaller turbines for decades before we reached this sort of price efficiency.

* Betz' law explicitly disavows picking a number of blades: "Assumptions: 1. The rotor does not possess a hub and is ideal, with an infinite number of blades which have no drag. Any resulting drag would only lower this idealized value. ..."

* Betz' law is a three-dimensional consequence of conservation laws, not an observation about turbulent blade interactions. "Consider that if all of the energy coming from wind movement through a turbine was extracted as useful energy the wind speed afterwards would drop to zero. If the wind stopped moving at the exit of the turbine, then no more fresh wind could get in - it would be blocked. In order to keep the wind moving through the turbine there has to be some wind movement, however small, on the other side with a wind speed greater than zero. Betz' law shows that as air flows through a certain area, and when it slows from losing energy to extraction from a turbine, it must spread out to a wider area. As a result geometry limits any turbine efficiency to 59.3%."

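For reference, the 59.3% figure in the quote drops out of one line of actuator-disc algebra with no blade count anywhere in it. A quick numerical check:

```python
# Actuator-disc theory behind Betz' law: with axial induction factor a
# (the fractional slowdown of the wind at the rotor plane), the power
# coefficient is Cp(a) = 4a(1 - a)^2. Maximizing over a gives the limit.
def cp(a: float) -> float:
    return 4 * a * (1 - a) ** 2

best_a = max((i / 10000 for i in range(10000)), key=cp)
print(round(best_a, 3), round(cp(best_a), 4))  # maximum is 16/27 at a = 1/3
```
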
* We already have a good idea what a moderate redesign of the fundamentals of a wind turbine looks like; it points downwind, it's huge like the existing turbines, and its two blades bend. They just need to solve the tower strike problem.
https://www.technologyreview.com/s/401583/wind-power-for-pen...
https://www.technologyreview.com/s/528581/two-bladed-wind-tu...

calinet6 2 days ago 2 replies      
This is a classic example of sub-optimization (and even if it's hand-wavey pseudo-science and may be wrong, the general concept is still interesting).

Each individual wind turbine is optimized to be the "most efficient" it can possibly be.

But not in the context of the environment, which requires complex control systems and methods to reduce damage in non-optimal conditions.

And not if you take cost and efficiency of the whole system into account; where four times as many small turbines with less complexity run more often and produce more overall output.

But hey, each wind turbine is "optimal." Interesting.

Most complex systems have this property. Even (especially?) your business.

neilk 1 day ago 3 replies      
Erm, I don't know if I trust Cringely when it comes to this sort of thing. He sold a reality show to PBS, "Plane Crazy (1998)"[1] with the promise he could design and build a small plane in 30 days that was unlike any other that had ever been built.[2]

His engineer just flat out refused to produce any plans for this design. He tried to make something like his plane happen anyway, going without sleep for weeks, and had a meltdown on camera. The show eventually had to be about him giving up on his cherished design and building a throwback to biplanes, with a family of artisans who are experienced in building them.

I had the feeling that Cringely was trying to channel Steve Jobs and it didn't work out. (Note: Jobs drives other people to do impossible missions without sleep, not himself, and doesn't display his work until it's ready, giving the appearance of effortless superiority).

Anyway as for this wind farm idea, the evidence of his friend's alternative propeller design is interesting. But let's also note that this friend chose not to pursue wind farms, maybe for a reason.

[1] http://www.pbs.org/cringely/pulpit/1998/pulpit_19980724_0005...

[2] EDIT: I am not an aviation person, and my memory of this is poor, but from Googling I see apparently he wanted to make it out of unusual materials, have foldable wings, and put the engine behind the pilot. I have a memory of his design requiring the pilot to straddle the drive train to a front-mounted propeller. I don't know how problematic that is, but I remember it as being presented as a major problem.

EDIT 2: To be honest I feel a bit bad now about this being a top-voted comment, as Cringely learned a lesson on camera that many startup people learn behind closed doors confidence is great, but trying to innovate in too many directions at once will kill you. Maybe that means that he is more cautious now, so perhaps he's really sure this thing will work. On the other hand, it was the first thing I thought of, that maybe he doesn't have a great track record with aviation iconoclasm.

Houshalter 2 days ago 5 replies      
Slightly relevant, here's a really weird wind turbine designed by a computer: http://m.youtube.com/watch?v=YZUNRmwoijw
lorenzfx 2 days ago 1 reply      
I'm not saying he is wrong, but he makes some bold claims without giving any proof.

Some examples:

> Twelve blades is a nice number.

Why twelve? Why not 50, it's a nice number as well.

> Lipps turbines can operate in faster winds [...] turbines could be allowed to run 24/7 in any wind with no computer

So even in the strongest winds, turbines with 40-foot blades do not need to be stopped?

> using permanent magnet generators instead of alternators, but those are more expensive> Use permanent magnet generators leading to [...] even lower cost.

Which is it now?

> what matters isn't power efficiency per turbine so much as power production per acre of wind farm.

Isn't it rather electricity production efficiency in terms of invested capital?

Jedd 2 days ago 4 replies      
Looks like a fine opportunity to ask - whatever happened to vertical axis wind turbines?

They sounded like they had a bunch of advantages, not least a much higher tolerance for extreme wind conditions, less stress on long (suspended) blades, and possibly fewer gearing issues in translating motion back to the ground - but presumably they had / still have distinct disadvantages?

nabla9 2 days ago 1 reply      
One graph is worth a thousand words.

Rotor power coefficient vs. tip-speed ratio.


dzdt 2 days ago 0 replies      
We all love the story about the clever little guy who thinks outside of the box taking down the big giant corporation who does things the way they have always been done. Cringely tries to tell this as such a story, or one that could be, or could have been. But there is really no evidence backing it, just his desire to tell a good story.
vanderZwan 2 days ago 0 replies      
Relevant Low-Tech Magazine entries on wind-power, and small windmills:

Urban windmills harm the environment [0]

> A small windmill on your roof or in the garden is an attractive idea. Unfortunately, micro wind turbines deliver hardly enough energy to power a light bulb. Their financial payback time is much longer than their life expectancy and in urban areas they will not even deliver as much energy as was needed to produce them. Sad, but true.

Small windmills put to the test [1]

> A real-world test performed by the Dutch province of Zeeland (a very windy place) confirms our earlier analysis that small windmills are a fundamentally flawed technology

(Note that the picture shows that almost all windmills tested had three blades)

Wind powered factories: history (and future) of industrial windmills [2]

> In the 1930s and 1940s, decades after steam engines had made wind power obsolete, Dutch researchers obstinately kept improving the already very sophisticated traditional windmill. The results were spectacular, and there is no doubt that today an army of ecogeeks could improve them even further. Would it make sense to revive the industrial windmill and again convert kinetic energy directly into mechanical energy?

Unlike this story, which certainly sounds interesting but shares no real data to back it up, Kris de Decker thoroughly digs through sources to write articles backed up by available data as best as possible.

[0] http://www.lowtechmagazine.com/2008/09/urban-windmills.html

[1] http://www.lowtechmagazine.com/2009/04/small-windmills-test-...

[2] http://www.lowtechmagazine.com/2009/10/history-of-industrial...

Animats 1 day ago 0 replies      
In the 1970s and 1980s there was much more variety in wind turbine design. There were two bladed machines, multi-blade ducted turbines, Darrieus rotors, and other exotic technologies. Outputs were in the 50KW range. Pacheco Pass in Northern California had examples of most of those. Some didn't work too well. Loss of blade accidents were common in the early days, with blades thrown considerable distances.

The three-bladed machines won out commercially. Machine size went up because output vs cost decreases with size, at least up to 1-2 MW. Lots of little machines were a pain to install and maintain.

Wind generators used to be AC generators synchronous to the grid. But with higher power semiconductors available, putting a big AC-DC-AC converter on the output to sync it to the grid is becoming popular.[1][2] This allows generating some power during low-wind conditions, and provides much more adaptability to wind gusts. When the wind speed changes, the blade pitch is adjusted to compensate, but on big turbines, this takes tens of seconds. Being able to adjust electrically in milliseconds avoids power grid transients.

The push for permanent magnet motors in wind turbines is more about converting to direct drive and getting rid of the gearbox. Wind turbine gearboxes are a huge pain, wearing badly for reasons that were only understood in the last few years.

[1] http://www.theswitch.com/wind-power/[2] http://new.abb.com/motors-generators/generators/generators-f...

_Codemonkeyism 2 days ago 2 replies      
As a kid I always found it interesting that (some?) Spitfires had 4-blade propellers while the main adversary, the [Edit] Bf109, had 3 [1], and wondered: shouldn't there be an optimum?

[1] I'm not sure whether both changed the number of blades during WWII.

justinph 2 days ago 0 replies      
Small turbines do exist. One was installed on This Old House recently: http://www.thisoldhouse.com/toh/tv/ask-toh/video/0,,20961006... (jump to about 15:00)

But, it does have three blades.

Aelinsaar 2 days ago 2 replies      
I like the idea of wind turbines, but in practice I think PV is the way to go (long term). I realize that for now it's a blend of technologies, but PV doesn't kill millions of birds and bats. I grant you, it's better than burning coal, but still we can do better eventually.
scblock 1 day ago 2 replies      
There is a lot of bad information here, and it appears there are others in these comments who are also better informed than the author. Let me just hit up a couple of points that are fundamentally wrong, though.

- "Conventional wisdom says wind farms should have their turbines placed in such a way that they don't interfere with each other, the fear being that placing one turbine too closely in the shadow of another will reduce the efficiency of the shadowed turbine."
True, and this is why we have wake models to predict the losses from other turbines and optimize placement.

- "The rule of thumb, then, is that turbines be placed no closer than seven diameters apart. Keep that number in mind."
Not true. You may find that a 7x7 array is relatively common in offshore applications, but a typical onshore application is more likely to be between 2-4 diameters apart in a row, with rows 7-13 diameters apart front to back.

- "Oh, and turbines are placed seven diameters apart. That's it, no CFD."
Wrong. But CFD is generally computationally complex, so we usually use models with reduced fluid dynamics equations to make it possible to iterate quickly. See previous comment about wake models.

- "In some cases wind farm automation can cost as much as the turbines, themselves."
I'd like to see these magical cases. A typical 2 MW turbine costs $2 million to purchase, and about $3-3.5 million total as part of an overall project of 50 turbines. SCADA is a minor fraction of this, as is operations.

- "Shorter blades are stronger than longer blades, so the Lipps turbines can operate in faster winds."
This is a non-issue. There are very few sites in the world that require even the highest wind speed turbine designs; most of the world is less windy, and the majority of sites benefit from using turbines designed for lower winds.

- "Use permanent magnet generators and the turbines could be allowed to run 24/7 in any wind with no computer control required at all, leading to more production at even lower cost."
Computer control of turbines is a non-issue, and the cost is minimal relative to the raw materials cost of the machines.

- "This is because they use alternators that consume electrical power to energize their windings so there is no point in turning on the alternator (energizing those windings) until there's enough wind to generate a net positive amount of electricity."
This is only a little bit correct, and mostly not. Wind turbines by design generally need to be connected to the grid to run, but winding energization is not why they don't start generating until there is enough wind. Turbines are generally on all the time, and typically consume anywhere from 10 to 50 kW at idle. And I don't know why he's using the term "alternators" to describe the turbine generators. The most common generator type is a doubly-fed induction generator, but squirrel cage induction generators, permanent magnet generators, and synchronous generators are often used. Usually turbines are connected to the grid through power converters which allow them to run at various speeds while remaining electrically synchronized with the grid.

- "Remember the diameters are smaller so instead of hundreds of turbines we're talking about thousands of turbines for the same wind farm. Imagine a field of mature dandelions."
This is actually a problem. When you can get 100 MW with a 50% capacity factor by building 50 machines in one township in Nebraska, why would you want to build 1,000 machines instead? How is that less complex?

- "Try breaking into the industrial wind power business without at least $1 billion in capital. It can't be done. The incumbent companies like it that way, too."
Manufacturing is capital intensive. News at 11.

- "Lipps wind farms could be closer to cities and therefore have lower transmission losses, further increasing power output."
Wind farm placement is about where the wind resource is. It's an economic decision.

- "The result of all this not starting and then stopping is that throughout the year an average workload of 23 percent is reached by inland wind farms, 28 percent for coastal farms and 43 for off-shore."
I have to assume his "average workload" here (a term I've never heard in the industry) is equivalent to capacity factor, which is the ratio of actual energy produced to the maximum possible. Most new projects in the windy areas of the US have predicted capacity factors of greater than 40 percent. High-wind losses are typically very low, and online time is typically very high.

- "China will build the heck out of those smaller blades."
China is also building the heck out of the larger blades. China has more wind capacity installed than any other nation.

- "And no insane cows, either. Cattle can't be pastured under wind farms because the motion of the turbine blades and especially their sound drives cows crazy."
Tell that to these cows: https://www.flickr.com/photos/ashcreekphoto/7793429362 (did he even do an image search before posting that? I've built wind projects on cattle ranches.)

Why does this post use such an old, crappy US wind map? Why not the newer DOE wind maps available at http://apps2.eere.energy.gov/wind/windexchange/wind_maps.asp

I'm sure I could go on, but this is just a fundamentally misinformed article. It's trying to make an aerodynamic argument (which I am not qualified to judge) using a mess of bad or incorrect information.
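
For the curious, the reduced-fluid-dynamics wake models mentioned above can be as simple as the classic Jensen (Park) model, where the speed deficit behind a rotor decays with the square of a linearly expanding wake. The thrust coefficient and wake-decay constant below are typical textbook values, not numbers from any particular project:

```python
import math

def jensen_deficit(x_d, ct=0.8, k=0.075):
    """Fractional wind-speed deficit x_d rotor diameters downstream,
    per the Jensen/Park model: (1 - sqrt(1 - Ct)) / (1 + 2*k*x_d)^2.
    Ct = rotor thrust coefficient; k = wake decay constant (~0.075 onshore)."""
    return (1 - math.sqrt(1 - ct)) / (1 + 2 * k * x_d) ** 2

for x_d in (3, 7, 13):
    print(x_d, round(jensen_deficit(x_d), 3))
```

Iterating this over every turbine pair and the wind-direction distribution is cheap, which is why layout optimization uses such models rather than full CFD. Note how much of the deficit has decayed by 7-13 diameters downstream, matching the row spacings quoted above.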

tantalor 2 days ago 0 replies      
The 2016 tag is not really necessary; this was published today.
toolslive 2 days ago 0 replies      
"why... very often the turbines aren't turning at all?"
Someone told me that demand varies and it's easier to shut down a windmill than a nuclear reactor.
tlb 2 days ago 2 replies      
More blades and closer spacing between windmills optimizes power / land area. But what matters is power / capital cost.
_Codemonkeyism 2 days ago 0 replies      
I'm interested in the results when Cringely builds his farm. For example, support costs from thousands of turbines instead of hundreds, and how many turbines on one pole are best. Also building costs over time, e.g. when costs go down due to robots planting poles, which favors farms with more poles over those with fewer.
yason 1 day ago 0 replies      
I don't know much about wind turbines and the behaviour of moving volumes of air with regard to airfoils but there's one point I picked off of the article.

If I were to put my money either in the most advanced big-farm wind high-tech or in something that is decentralized, mass-produced, and well-abused by all kinds of groups of people, it would be the latter.

The article mentions that one billion is not enough to enter the game. That surely excludes a lot of the smartest and brightest people who might come up with new innovations and the billion-scale investments also seek conservative returns, further culling new ideas.

stcredzero 1 day ago 0 replies      
This comment on the post is pure gold!

Fascinating! It's not about asking questions, it's about asking the right questions! The first framing question is efficiency from blade to outlet, but it's really about efficiency from capital markets to factory floor to farm to outlet.

That should become a mantra. It's about efficiency from capital markets to revenue per customer.

elcapitan 2 days ago 0 replies      
I would have guessed that it's a trade-off between efficiency and minimizing public outrage in densely populated areas (yes, outside the US that can be an issue). Three blades, when the turbine is not working, block less from the view than a large number of blades (which converge towards giant white surfaces).
at-fates-hands 2 days ago 1 reply      
Here's a much better Wind Resource Map than the one in the article: http://www.tindallcorp.com/site/user/images/USA_Wind_Map_for...
coldcode 1 day ago 0 replies      
The more worrying part of this is that you need ~$1B of capital to even try to get into this business if all you do is build the monsters. So that means few competitors (or even one), which makes monopolist behavior and unimaginative thinking likely.
manmal 2 days ago 0 replies      
It seems Steve Jobs was inspired by Socrates: https://en.m.wikipedia.org/wiki/Socratic_method
xutopia 1 day ago 0 replies      
In some parts of the world they use 2 blades so they can lay the system down when the wind is too high (think tornado/hurricane season).
trhway 1 day ago 0 replies      
The fewer the blades, the higher the efficiency, but the count should be odd (to avoid standing waves) and the rotor symmetric. Thus 3.
mrfusion 1 day ago 0 replies      
Why can't the blades be staggered so they don't follow in each other's wake?
eonw 1 day ago 0 replies      
This is one of the best articles I have read recently. I too am a big fan of always asking why and trying to buck the trend of "that's just the way it is".
NVIDIA Announces the GeForce GTX 1000 Series anandtech.com
379 points by paulmd  1 day ago   203 comments top 31
onli 1 day ago 3 replies      
This article buys into the hype. It reads like they just copied statements from the press release. The article on AnandTech[0] is a lot better.

Also, do not forget that benchmarks made by Nvidia mean nothing. How fast the cards really are will only be clear if independent people do real benchmarks. That's true for games, but it will also be true for all the speculation found here over the ML performance.

[0] http://anandtech.com/show/10304/nvidia-announces-the-geforce...

neverminder 1 day ago 7 replies      
What really blew me away is that they went straight for Displayport 1.4 (https://en.wikipedia.org/wiki/DisplayPort#1.4) that was only announced on 1st of March 2016. DP 1.3 was approved on 15th of September 2014 and as of today there are no cards supporting it except this one (with backwards compatibility).

The bad news is that there's no news about new gen displays that could take advantage of this graphic card. I'm talking about 4K 120Hz HDR (https://en.wikipedia.org/wiki/High-dynamic-range_rendering) displays. This is a total WTF - we have a graphic card with DP 1.4 and we don't even have a single display with so much as DP 1.3...

mrb 1 day ago 4 replies      
Note that in terms of pure compute performance the new 16nm Nvidia GTX 1080 (2560 shaders at up to 1733 MHz = 8873 SP GFLOPS) barely equals the performance of the previous-generation 28nm AMD Fury X (4096 shaders at up to 1050 MHz = 8602 SP GFLOPS). Of course the 16nm chip does so at a significantly lower TDP (180 Watt) than the 28nm chip (275 Watt), so it will be interesting to see what Nvidia can achieve at a higher thermal/power envelope with a more high-end card... I am waiting impatiently to see how AMD's upcoming 14nm/16nm Polaris chips will fare, but from the looks of it it seems like Polaris will beat Nvidia in terms of GFLOPS per Watt.
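The shader-count arithmetic above is easy to reproduce. A quick sketch, assuming the usual convention that each shader retires one fused multiply-add (2 FLOPs) per clock, which is how these peak figures are quoted:

```python
def sp_gflops(shaders, boost_clock_mhz):
    """Peak single-precision GFLOPS: one FMA (2 FLOPs) per shader per clock."""
    return shaders * 2 * boost_clock_mhz / 1000.0

gtx_1080 = sp_gflops(2560, 1733)  # ~8873 GFLOPS
fury_x = sp_gflops(4096, 1050)    # ~8602 GFLOPS

# Perf/watt at the quoted TDPs
print(gtx_1080 / 180)  # ~49.3 GFLOPS per watt
print(fury_x / 275)    # ~31.3 GFLOPS per watt
```

These are theoretical peaks; sustained clocks and real workloads will land lower.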
gavanwoolery 1 day ago 2 replies      
Worth noting - IIRC the stream mentioned that it could do up to 16 simultaneous projections at little additional performance cost. This is important for VR... a big part of the cost, when you are dumping many vertices to the GPU, is performing a transform on each vertex (a four-component vector multiplied by a 4x4 matrix) +. An even bigger cost comes from filling the resulting polygons, which if done in two passes (as is fairly common) results in something that violates cache across the tiles that get filled. So, in other words, it's expensive to render something twice, as is needed for each eye in VR - from what they have shown, their new architecture largely reduces this problem.

+ This is a "small" part of the cost, but doing 5m polygons at 60 fps can result in about 30 GFLOPS of compute for that single matrix operation (in reality, there are many vertex operations and often many more fragment operations).

frik 1 day ago 2 replies      
I am waiting for the Nvidia GTX 1070, and sincerely hope NVidia doesn't fuck it up again like the GTX 970.

GTX 970: It was revealed that the card was designed to access its memory as a 3.5 GB section, plus a 0.5 GB one, access to the latter being 7 times slower than the first one. -- https://en.wikipedia.org/wiki/GeForce_900_series#False_adver...

paperwork 1 day ago 5 replies      
Can someone provide a quick overview of the current GPU landscape?

There seems to be Nvidia's pascal, gtx, titan, etc. Something called geforce. And I believe these are just from Nvidia.

If I'm interested in building out a desktop with a GPU for:

1. Learning computation on GPUs (matrix math such as speeding up linear regression, deep learning, CUDA) using C++11

2. Trying out Oculus Rift

Is this the right card? Note that I'm not building production models. I'm learning to use gpus. I'm also not a gamer, but am intrigued by oculus. Which GPU should I look at?

tostitos1979 1 day ago 5 replies      
I was helping a friend put together a Titan X "rig" and we realized that case space, power supply and motherboard slots were some mundane but frustrating challenges. For someone building out a new rig for personal/hobbyist work in deep learning, any recommendations? Is the best setup to get two 1080s, 16-32 GB of RAM and a 6th generation i7?
marmaduke 1 day ago 2 replies      
The lower TDP is just as significant as speed. I've got a pair of GTX 480 that I can use to heat my office with a light workload. How many 1080s could run in a single workstation?
jkldotio 1 day ago 1 reply      
Isn't the Titan X also about RAM? It has 12GB to the GTX 1080's 8GB. At the low end you could just buy more than one GTX 1080, so it looks like a good deal there, but at the top end you are running out of slots for cards.
voltagex_ 1 day ago 2 replies      
I wonder what the idle power usage of one of these would be? I wish my motherboard allowed me to turn off the dedicated card and fall back to the Intel chipset when I didn't need the performance.
kayoone 1 day ago 5 replies      
I am just curious (and a total machine learning novice). If you were to experiment with ML, what are the benefits of getting a fast consumer card like this versus using something like AWS GPU instances (or some other cloud provider)? Or phrased differently: when does it make sense to move from a local GPU to the cloud, and vice versa?
tjohns 1 day ago 2 replies      
All the benchmarks I can find are comparing this against the GTX 980. I'm curious how it compares to a 980 Ti.
drewg123 1 day ago 2 replies      
Does anybody know what kind of performance a modern Nvidia card like this can provide for offloading SSL ciphers (aes gcm 128 or aes gcm 256)?
mioelnir 1 day ago 1 reply      
Towards the end of the article, in the table overview, they list an `NVIDIA GP100` model with a memory bus width of 4096 bits. It is still shared memory, but considering bcrypt only requires a 4k memory working set, that now fits into 8 cycles instead of the 128 needed on 256-bit bus architectures...

Am I wrong to think this card could really shake bcrypt GPU cracking up?
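Back-of-envelope on those cycle counts, treating the bus as delivering its full width every cycle and ignoring latency and burst behavior:

```python
working_set_bits = 4 * 1024 * 8  # bcrypt's 4 KiB S-box working set

cycles_4096_bus = working_set_bits // 4096  # cycles to stream it over a 4096-bit bus
cycles_256_bus = working_set_bits // 256    # cycles over a 256-bit bus

print(cycles_4096_bus, cycles_256_bus)  # 8 128
```

In practice random access latency, not raw bus width, tends to dominate bcrypt cracking, so the real-world speedup would likely be smaller.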

tormeh 1 day ago 0 replies      
I'll buy when there's Vulkan/DX12 benchmarks and real retail prices for both AMD and NVIDIA next-gen cards. Buying now seems slightly premature.

But oh man am I excited!

rl3 1 day ago 0 replies      
I'm somewhat dismayed that the GTX 1080 has only 8GB of VRAM considering that previous generation AMD products already had 8GB, and the previous generation Titan model had 12GB, the latter of course being outperformed by the GTX 1080.

Then again, rendering a demanding title at 8192x4320 on a single card and downsampling to 4k is probably wishful thinking anyways. However, it's definitely a legitimate concern for those with dual/triple/quad GPU setups rendering with non-pooled VRAM.

On the bright side, 8GB should hopefully be sufficient to pull off 8k/4k supersampling[0] with less demanding titles (e.g. Cities: Skylines). Lackluster driver or title support for such stratospheric resolutions may prove to be an issue, though.

It's possible Nvidia is saving the 12GB configuration for a 1080 Ti model down the road. If they release a new Titan model, I'm guessing it'll probably weigh in at 16GB. Perhaps less if those cards end up using HBM2 instead of GDDR5X.

[0] https://en.wikipedia.org/wiki/Supersampling

ChuckMcM 1 day ago 0 replies      
This has a similar leap to the one I felt when the 3Dfx Voodoo 2 SLI came out. The possibilities seem pretty amazing.

I'm interested to know how quickly I can plug in a machine learning toolkit; it was a bit finicky to get up and running on a box with a couple of 580GTs in it, but that might just be because it was an older board.

partiallypro 21 hours ago 1 reply      
I'm more interested in the impact it will have on the 10 series as a whole. I literally just bought a 950, now I'm wondering if the 1050 will be priced just as reasonably and offer big performance gains. Also, what is the timetable for the rest of the 10 series?
Szpadel 1 day ago 0 replies      
> The GP104 Pascal GPU Features 2x the performance per watt of Maxwell.


> The GTX 1080 is 3x more power efficient than the Maxwell Architecture.

I think that someone got carried away by imagination.

I found that the 980 has 4.6 TFLOPS (single precision). And assuming that the 1080's figure (9 TFLOPS) is also single precision and the new card has the same TDP, this is a 1.95x increase, so ~2x.

EDIT: I found that the 1080 will have a 180W TDP, where the 980 has 165W, so correction: it will be a 1.79x increase.
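The arithmetic in this comment, spelled out (figures as quoted above, so treat them as rough):

```python
tflops_980, tdp_980 = 4.6, 165.0     # GTX 980: SP TFLOPS, TDP in watts
tflops_1080, tdp_1080 = 9.0, 180.0   # GTX 1080

raw_speedup = tflops_1080 / tflops_980                              # ~1.96x throughput
perf_per_watt = (tflops_1080 / tdp_1080) / (tflops_980 / tdp_980)   # ~1.79x per watt

print(round(raw_speedup, 2), round(perf_per_watt, 2))  # 1.96 1.79
```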

kosmic_k 21 hours ago 1 reply      
Does anyone know why video cards stayed on a 28nm process for so long? It appears that a significant factor in this incredible leap of performance is the process change, but I'm puzzled as to why 22nm was skipped.
agumonkey 1 day ago 1 reply      
Not long ago I found this 2009 project https://en.wikipedia.org/wiki/Fastra_II

built to give a 'desktop' cluster class performance based on GPUs. I wonder how it would fare against a 1080.

MichaelBurge 1 day ago 1 reply      
How good does it look for a hobbyist manipulating large matrices for use in machine-learning?
valine 1 day ago 0 replies      
I'm hopeful this will provide a significant speedup for my 3D modeling / rendering. The number of CUDA cores is only slightly higher than the 780's. I'll definitely wait for more benchmarks.
science404 1 day ago 2 replies      
No word on double-precision performance? Could replace the Titan Z and Titan (black) as a good CUDA-application card..
cheapsteak 1 day ago 3 replies      
Could someone clarify if they meant that one 1080 is faster than two 980s in SLI? Or did they mean two 1080s were faster than two 980s?
tronreg 1 day ago 3 replies      
What is the "founders edition"?
interdrift 1 day ago 0 replies      
This will be pretty nice for VR because it will push older-generation cards down in price.
rasz_pl 1 day ago 1 reply      
and still not fully DX 12, oh nvidia
mtgx 1 day ago 2 replies      
"GTX 1080"

"10 Gaming Perfected"

Seems like a missed marketing opportunity to make it a 10 TFLOPS card.

BuckRogers 1 day ago 2 replies      
I watched the livestream, looked good and love the performance/watt. 180watt card. Way more GPU power than I need professionally or for fun though. I'm actually all-in on the new Intel gaming NUCs. Skull Canyon looks fantastic and has enough GPU performance for the gaming I do anymore. Mostly older games, some CS:Go (my friends and I play at 1280x1024 anyway since it's competitive gaming) and League of Legends.

It's also nice to have an all Intel machine for Linux. I'd use a lowend NV Pascal to breathe new life into an older desktop machine since NV seems to always have a bit better general desktop acceleration that really helps out old CPUs. If building a high end gaming rig I'd probably wait for AMD's next chip. I've liked them more on the high end for a few generations now. Async compute and fine-grained preemption, generally better hardware for Vulkan/DX12. AMD is also worth keeping an eye on for their newfound console dominance, subsequent impact if they push dual 'Crossfire' GPUs into XboxOne v2, the PS4K and Nintendo NX. That would be a master stroke to get games programmed at a low-level for their specific dual GPUs by default. Also, the removal of driver middleware mess with the return of low-level APIs to replace OGL/DX11 will remove the software monkey off AMD's back. That always plagued them and the former ATI a bit.

I'll probably buy the KabyLake 'Skull Canyon' NUC revision next year and if I end up missing the GPU power, hook up the highest end AMD Polaris over Thunderbolt. Combining the 256MB L4 cache that Kabylake-H will have with Polaris will truly be next-level. Kaby also has Intel Optane support in the SODIMM slots, it's possible we'll finally see the merge of RAM+SSDs into a single chip.

But more than anything, I want Kabylake because it's Intel's last 14nm product so here's to hoping they finally sort out their electromigration problems. Best to take a wait and see on these 16nm GPUs for the same reason. I'm moving to a 'last revision' purchase policy on these <=16nm processes.

amelius 1 day ago 4 replies      
Why are these coprocessors still positioned as graphics cards?
Our 2016 Open Source Donations duck.co
385 points by wicket  1 day ago   50 comments top 13
MichaelBurge 1 day ago 10 replies      
Donating to open-source seems like such a good use of charity money. I never give to charity because it always seems so abstract, or there might be better ways to solve the problem; with open source people are usually laboring over it with no recognition, and even a little seems like it has such high marginal benefit.

I'm in-between jobs right now(occupied with a side project), but at some point I'd really like to fund feature development on some open source projects:

* GNUCash has a solid heart, but has some usability issues that make it a pain to use in practice.

* Freenet last I checked had only 1 fulltime developer. And he's probably taking a serious cut to market salary to work on it.

* GHC could stand to have some performance optimization done on the compiler.

* Inkscape or GIMP are handy to have around. Inkscape even has a page describing how you can host a fundraiser for targeted feature development, which is very rare for open-source.

* I don't know that TOR needs much software help, but I wouldn't mind funding some exit nodes. It'd be nice if you could buy a locked-down black-box exit node that you could plug into your wall or something, that was guaranteed not to incriminate you. Maybe outside the scope of this, though.

* Everyone has a little app or site they use where a few people are working without much benefit to maintain something you use all the time.

Is there a good list of needy open-source software?

frik 1 day ago 2 replies      
The donations are great for open source projects.

I just wonder if DDG is investing in their own crawler? With the Yahoo BOSS API biting the dust, they are losing access to Bing search results. DDG nowadays has to rely on Yandex (a Russian search engine) for its results, and those have gone from okay to a bit worse. So what's their long-term strategy? Stay a meta-search-engine, or invest major resources to crawl the web themselves? What many people hate is high latency, and to lower latency you cannot rely on third-party APIs for the main search.

levemi 1 day ago 4 replies      
I'm surprised DDG can donate so much money to open source. They must be doing well? Or is this money they've helped raise from their users? $225,000! That's a lot of money.
kriro 1 day ago 1 reply      
It seems like a pretty great list at first glance. Lots of free speech and/or crypto related projects. Good job DDG, once again doing it right. I hope this generates some good PR for you :)

[Now hurry and make DDG better for non-English languages, I want to use it for everything and more importantly want to make it the default for my parents, friends etc...well I guess strategically it doesn't make much sense if the target niche is developers but one can hope :D]

LeoPanthera 1 day ago 2 replies      
I'm surprised to learn that Freenet is still a thing. What do people use it for?
analognoise 1 day ago 0 replies      
I wish somebody would just give KiCad a few million dollars and we could be free of OrCad/Altium/Etc
aavotins 1 day ago 0 replies      
Very nice. More companies should come up with posts like this to a) spread awareness b) encourage more people to work on open source projects (when there's money on the table). That would also clear up the question of how open source software can be sustainable and where funds come from.
Lxr 1 day ago 2 replies      
Wow, DDG has come a long way since I was last there a few years ago. It's great to have a serious player in the search space dedicated to privacy. After spending some time using it just now, though, their ranking algorithm still feels somewhat inferior to Google's. This is not surprising given the resources Google devotes to machine learning and the like, I guess. Does anyone know roughly how DDG arrives at its results?

On topic, I think their donation is a great move.

giulivo 1 day ago 0 replies      
Now... this is certainly great, especially given the particular selection of projects receiving the donations. I've upvoted the article and hopefully more companies (aka open source consumers) will do the same.

But then I wonder: what is it that the community would benefit most from? Money or actual code contributions?

I develop open source software and I get paid for doing it, which is a great luxury. Go DDG, raise the bar even further, employ people working on these projects if you don't have any already and even release more software!

lowglow 1 day ago 1 reply      
I'd love to bring really important open source projects to baqqer and support them with monthly pledges. I'd like them to be transparent and open with their direction and progress, while simultaneously letting the community there help guide and support their efforts. Does anyone know any great open source projects that would be open to this?
agentgt 1 day ago 1 reply      
I know Duck is all about the trust, but couldn't they have picked some non-encryption/security OSS projects? It sort of reminds me of the brief period where all these sustainability/eco startups were picked for awards or help.
homero 1 day ago 1 reply      
Where are they getting money
j0e1 1 day ago 0 replies      
SpaceX: Landing confirmed twitter.com
395 points by ajdlinux  2 days ago   1 comment top
dang 2 days ago 0 replies      
Comments moved to https://news.ycombinator.com/item?id=11642855. We don't usually merge an older thread into a newer one, but people obviously want to keep discussing this, so it seems like the best way to avoid more duplicate posts.
Redis 3.2.0 is out antirez.com
361 points by mohamedbassem  2 days ago   65 comments top 8
koolba 2 days ago 4 replies      
Redis is in a very small category of software that's been pleasant to work with from day one and continues to be on a daily basis.

Kudos to antirez on everything before this and another great release.

cft 2 days ago 6 replies      
I like their new GEO sorted set with latitude and longitude. Can someone point me to a [reasonably priced] geocoding database to display nearest Town, Region (only where appropriate), Country (in English) based on latitude and longitude?
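For anyone who hasn't tried the new commands, a rough sketch of the GEO API as described in the 3.2 docs (the coordinates here are just illustrative):

```
GEOADD cities 13.361389 38.115556 "Palermo"
GEOADD cities 15.087269 37.502669 "Catania"
GEORADIUS cities 15 37 200 km WITHDIST
GEODIST cities "Palermo" "Catania" km
```

Under the hood it's a plain sorted set with geohash-derived scores, so the usual ZSET commands work on the same key.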
xomateix 2 days ago 1 reply      
Just realised that http://download.redis.io/ is not available under https (neither is http://redis.io).

It may be worth downloading it from github.

justjico 2 days ago 0 replies      
If I was a betting man, I would bet the new "interesting feature" is related to time series. Or maybe that's just my wishful thinking.
kinkdr 1 day ago 0 replies      
I've been using the RC for a couple of months now in production, albeit my site is very small, and it has proven rock-stable. And the code is a pleasure to read. Amazing piece of software! Keep up the good work!
matt2000 1 day ago 1 reply      
The new BITFIELD seems like it could be really cool, but I can't quite figure out the use case. Anybody planning on using that that can enlighten me? Thanks.
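One use case is packing many small counters or flags into a single key. A sketch of the syntax from the release notes (the key name, offsets, and types here are illustrative):

```
BITFIELD stats SET u8 0 255                  # write an unsigned 8-bit int at bit offset 0
BITFIELD stats INCRBY u16 8 10               # bump a 16-bit counter starting at bit 8
BITFIELD stats OVERFLOW SAT INCRBY u8 0 100  # saturate instead of wrapping
BITFIELD stats GET u8 0
```

Think per-user counters or millions of tiny metrics without paying a whole key's overhead for each.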
aeharding 2 days ago 4 replies      
The geo API seems rather... specific. Is there really a demand for it?
netcraft 2 days ago 1 reply      
does anyone know how quickly the official docker containers will be out? https://hub.docker.com/_/redis/

I could build it myself, but if the official image will be out in a day or so I'd rather wait, I think.

Panama Papers source issues statement icij.org
362 points by p0ppe  2 days ago   136 comments top 16
jobu 1 day ago 4 replies      
There's a lot to parse in that statement, but the call for whistleblower protections seems to be the single most important (and achievable) item. Most people are aware of Snowden and Manning, but it really surprises me that these haven't been reported more:

"Bradley Birkenfeld was awarded millions for his information concerning Swiss bank UBS, and was still given a prison sentence by the Justice Department. Antoine Deltour is presently on trial for providing journalists with information about how Luxembourg granted secret sweetheart tax deals to multi-national corporations, effectively stealing billions in tax revenues from its neighbour countries."

Law enforcement agencies don't have the resources or knowledge to go after much of the corruption and wrongdoing inside governments and large companies. If insiders with integrity don't have a safe way of stepping forward there will never be a way to keep wealthy/powerful/connected individuals from abusing the system.

1024core 1 day ago 1 reply      
The call for whistleblower protection is important, but it'll never happen. The powers-that-be don't want to encourage whistleblowers.

What may change their mind is if all the data were made public. Since whistleblowers have few protections, their only protection right now is to release everything on the 'net, anonymously. Now, clearly this is not a good idea, as in many cases there will be collateral damage. But what is the alternative? Once the governments see that such collateral damage is the only alternative, they will be forced to enact meaningful whistleblower protection.

mgraupner 1 day ago 6 replies      
But most of all, the legal profession has failed. Democratic governance depends upon responsible individuals throughout the entire system who understand and uphold the law, not who understand and exploit it.

Sigh. One of the most puzzling questions I have to deal with in my mind. Why is there so little morality left in this world?

voodootrucker 1 day ago 1 reply      
"issues involving taxation and imbalances of power have led to revolutions in ages past."

Given that all the checks and balances have failed, I see very few other options.

raverbashing 1 day ago 1 reply      
This statement is very interesting, it is very well written and researched, and it uses less common words. I wonder if there was some level of editing/embellishment done to it.

The author might well be a non-native speaker. It also seems to me that the author might have had some personal reasons to target MF.

But to be honest, I don't think income inequality is one of the most defining issues of our time; throughout human history, inequality, not only monetary but cultural and intellectual, has usually been higher.

sohcahtoa 1 day ago 5 replies      
And yet... why no American names in the leaks? Funny that.
tobltobs 1 day ago 0 replies      
There will never be better protection for whistleblowers. That is just not in the interest of those who make the laws nowadays. What will come are new laws which will make the publishing of whistleblower material illegal. The EU, for example, is currently working on a law for better protection against theft of trade secrets. With such a law in place, no newspaper would risk publishing something like the Panama Papers anymore.
jayjay1010 1 day ago 1 reply      
Question for John Doe: where are the missing American names and other redacted elements? Further, his calls for governments to do something to fix the problem seem naive in the extreme, since they were the ones who created the 'problem', or backdoors, in the first place.
return0 1 day ago 0 replies      
I think their upcoming release is more interesting:

The International Consortium of Investigative Journalists will release on May 9 a searchable database with information on more than 200,000 offshore entities that are part of the Panama Papers investigation.

The data [...] includes information about companies, trusts, foundations and funds incorporated in 21 tax havens, from Hong Kong to Nevada in the United States. It links to people in more than 200 countries and territories.

yxlx 1 day ago 0 replies      
The tone of the statement is very anonymous in nature. Cold and factual. I wonder if this is how its author normally expresses themselves, or if they've written in this style in order to defeat authorship attribution.
glasz 1 day ago 0 replies      
more and more this feels like a big smoke screen to keep sheep walking in circles. German publications are losing readers in droves; they desperately need to capitalize on a good story.

there's absolutely no real news here. but suddenly "democracy's checks and balances have all failed".

come on.

curuinor 1 day ago 1 reply      
This seems to be long enough for a statistical attack to narrow things down to ~100 people.
homero 1 day ago 0 replies      
Why can't I donate bitcoin
marmaduke 1 day ago 0 replies      
I am impressed at how well it's written.
joesmo 1 day ago 0 replies      
The revolution might be digitized, but if it does indeed happen, I'm afraid it will also be bloody as hell, as almost all revolutions are. I suppose some things never change.
nxzero 1 day ago 1 reply      
>> "Source known only as John Doe says income inequality "one of the defining issues of our time" and calls on governments to address it."

Calling on any government to create change, not its citizens, is a mistake, especially on a topic like this.

Tell HN: Apply HN apology and revision
638 points by dang  1 day ago   200 comments top 60
mdip 1 day ago 0 replies      
It's nice to see an organization behave like "a decent human being" from time to time.

This post is a shining example of exactly what people want to hear when they perceive a company/organization has failed (especially when it's apparent to the company/organization). It's direct/to the point, doesn't mince words, doesn't attempt to twist it into something it isn't and instantly, in my case, raised the level of respect that I have for YC[1]. Personally, I didn't fully understand the controversy and didn't feel it was as big of a deal as it was bubbling up to be, but everyone has their opinion. Had I been on the "yes, you fucked up badly" side, though, this would have been a response I would have never expected and been pleased to see.

I know there are reasons that organizations don't offer this level of candor. Many of them are the same reasons people choose not to apologise/own up for their own mistakes. The only valid reason is the one that comes from the legal department: Outright admitting a mistake opens one up to possible liabilities and an easy win from a plaintiff in court. In the especially litigious United States, this could be a "death blow" kind of risk. When it's not, I wouldn't want to be the guy in charge of weighing the "goodwill" benefit from handling an apology correctly against the costs of litigation (I'd prefer to attempt blind-folded archery through the wake of a 747). But I deeply wish organizations could behave more like individuals and handle an apology properly: Admit clearly you've screwed up, state the cause and corrections to prevent it in the future, and possibly provide something as a show of good faith that those actions are being followed. I would seek out and find a way to do business with companies that behaved like that.

[1] Which is funny to say. Frankly, I'd have expected a response like this from YC because they've tended to behave in an admirable way.

[2] All of which are terrible ideas and require one to only get over one's ego. Be quick to apologise and quick to forgive is my rule. Just because malice wasn't intended (which is practically always the case), doesn't mean things couldn't have been handled better and people weren't hurt just the same.

postscapes1 1 day ago 5 replies      
I might be in the minority opinion here, but I think Maciej is getting off light on this one.

I am a happy paying subscriber to Pinboard and enjoy his writing as much as anyone here, but Pinboard can barely be considered a startup at this point in time (running since 2009), and he seems like he has not really been interested in adding much new since then (for good reason..)

The original posting stated "It will be like a lighter version of YC for idea and prototype stage companies" - Which doesn't fit Pinboard at all.

I think the original response to the voting, etc was handled very professionally by YC, and Maciej should spend more time writing and less time stewing up trouble.

keithflower 1 day ago 4 replies      
Note to tech world: THIS is how you take responsibility, make things right when mistakes happen, and look out for the communities we live in.
danieltillett 1 day ago 1 reply      
My specific suggestions on how to improve the Apply HN process.

1. Allow more than 2000 characters for the application description. I don't think we need to make this endless - something like 5000 characters should work.

2. Increase the contrast of the text in the description - as a (slightly) older person, reading large blocks of light grey text on a light grey background is hard going.

3. Set clear guidelines on what we are supposed to be judging the applications on. Is it what we would most like to see funded, or is it what we think is the best fit for YC?

4. Once a shortlist is selected then let the applicants update the description in light of the discussion they have had. In my case I learned from the questions asked that many people missed what the market was for my idea and dismissed it on that basis. I really needed to go back into the original application and update the description to make this clearer.

5. YC should not limit themselves to the winners of the process. If they see an application they think is a great fit for YC then just interview the applicants. YC is getting a different category of applicants in both cases. In my case I put in an Apply HN application yet I had no interest in applying for a standard YCF or the YC program. Edit. I should add here that I am probably not a good fit for YC, but I am sure some of the other applicants that did not win were.

6. We need some better way of deciding on what is more important - what appeals to the HN community, or what has wider appeal. The HN community is a great resource, but it is not necessarily the best market for a startup. Many of the applications were voted up on what HNers wanted, not on what we thought the wider world wants.

colinbartlett 1 day ago 0 replies      
What a compassionate way to turn this into something good! Thank you to all parties.

If anyone wants to join me in also donating to the San Francisco Coalition on Homelessness, here is the donation form linked on their site:


(You have to actually click the "Take Action" link in the header: http://www.cohsf.org/)

karmacondon 1 day ago 2 replies      
I'm starting to lose faith in the concept of online voting. You get Pinboard, Boaty McBoatface and white house petitions about the Death Star. Maybe it's how easily links can be shared to reach a wider audience. Or just something about the nature of using the internet to vote. For some reason, nonsense seems to be more likely when people make decisions over http.

There's a fine line between "wisdom of the crowds" and "American Idol for startups". I'm not sure exactly where Apply HN falls. It doesn't seem like any individual investment will make or break YC, and this is an interesting idea for an experiment. But I don't come away from this feeling upbeat about the democratic process.

neurotech1 1 day ago 1 reply      
The ability to say "I F'd up" is an important leadership skill. Capt. Kohei Asoh [0] crashed a Japan Airlines DC-8 into the San Francisco Bay after miscalculating the final approach. Nobody was injured and the aircraft was repaired. It became famously known among pilots as the "Asoh Defense".

[0] https://en.wikipedia.org/wiki/Japan_Airlines_Flight_2#The_.2...

hsod 1 day ago 2 replies      
It's weird. I read that thread and gave it a lot of thought over the last couple of days, and I was just deciding that I respected your choice (even if it wasn't necessarily one I would have made myself). Then this unqualified mea culpa shows up and I don't know what to think.

In the world of PR, there's a very strong bias towards appeasement. If a controversy gets big enough, companies tend to just 'give in.' But these victories are hollow ones, as the true rightness or wrongness of the controversial actions becomes irrelevant. It's impossible to know whether a corporate/organizational apology is genuine or whether the stakeholders are simply appeasing the crowd.

My gut says this phenomenon has become more powerful in the social media era as consumer voices are more easily amplified.

levemi 1 day ago 1 reply      
For what it's worth, I only voted for Pinboard because I like what they're doing. I didn't vote for any of the other startups. I didn't see the promotional tweet, and I had hoped Pinboard would become a YC company. I'm disappointed all around; while I appreciate that the money will help people at the charity, I'd be lying if I said I wasn't disappointed that Maciej didn't join the YC family with Pinboard.
rdl 1 day ago 0 replies      
Wow, my respect for everyone involved is higher now than before this incident. Congratulations!
tptacek 1 day ago 1 reply      
You're a good man, Charlie Brown.
simonebrunozzi 1 day ago 0 replies      
Hey, well said.

Fuck-ups happen. The biggest difference is how you deal with it.

gus_massa 17 hours ago 0 replies      
Just three recommendations. (I think I have read them in other threads, but I want to insist on them.)

* Say that the HN community will select 2 projects for an interview for the YCF program, so it's completely clear that YC has the final word, veto power and whatever additional conditions seem necessary.

* Keep the last vote open for at least 24 hours, as it was in the extended period. I live in Argentina, so my time zone is similar to the USA's, but it would be nice if people outside North/South America had time to vote.

* (More difficult) Enable a "hide" option for the Apply HN threads. There were more than 250 applications; some of them were good and some of them were "obviously" bad. I'd like to filter them out of the random order (without flagging them), so I can make my own shortlist of 30-50 applications to read more carefully and upvote a few more.

alva 1 day ago 0 replies      
Great response dang and big respect to Maciej for directing the 20k towards charity. A good resolution.
minimaxir 1 day ago 2 replies      
While this is the best outcome for all parties, I hope that future experiments have more clearly-defined rules. The outcome shows that brigading is a favorable strategy, and it may set a bad precedent in the long run. (Such as with a certain other startup voting site where vote brigading is expected because everyone else is doing it.)
personjerry 1 day ago 0 replies      
Nice. I like how YC "moves fast" and introduces new experiments pretty often, but solves the "break things" issue gracefully and with good communication.
cperciva 1 day ago 6 replies      
If any of you have suggestions for how to do better, I'd like to hear them.

1. Make this "HN selects startups which get invited for YC interviews", i.e., a feeder into the existing system. Essentially, use HN to supplement the network of YC alumni who help out with the application-filtering.

2. Since YC will explicitly still have the final say, open this to both Fellowships and YC Core applications.

3. Have a standardized form. Or possibly even a "make this application public" checkbox on the regular YC application form.

4. Use a more sophisticated voting system. I think a "which of these two looks better" combined with a form of Elo rating could work quite well.
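A minimal sketch of how the pairwise-voting idea in point 4 might work (a hypothetical illustration only; the rating scale, K-factor, and function name are my assumptions, not anything cperciva specified): each voter sees two applications side by side and picks one, and a standard Elo update adjusts both ratings.

```python
def elo_update(rating_a, rating_b, a_won, k=32):
    """One pairwise 'which looks better?' vote: update both Elo ratings.

    expected_a is A's win probability implied by the current ratings;
    each rating moves by k times the difference between the actual
    outcome and that expectation.
    """
    expected_a = 1 / (1 + 10 ** ((rating_b - rating_a) / 400))
    score_a = 1.0 if a_won else 0.0
    new_a = rating_a + k * (score_a - expected_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - expected_a))
    return new_a, new_b

# Two applications start at the same rating; one vote for A moves
# A up and B down by the same amount (total rating is conserved).
a, b = elo_update(1500.0, 1500.0, a_won=True)  # → (1516.0, 1484.0)
```

Applications would then be ranked by rating after enough comparisons; a real system would also need to control how often each pair gets shown.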

EDIT: To the people voting this comment up: Do you agree with all four of those suggestions, just some of them, or are you voting it up because you like the fact that I'm offering ideas despite thinking that these four are all bad ones? As Dan has said a few times recently, discussion is more useful than votes. :-)

Kiro 1 day ago 0 replies      
I can't support Maciej when he's posting this kind of stuff:



cprecioso 1 day ago 0 replies      
While I was rooting for Maciej, I understood why Y Combinator didn't want to fund him, and I don't understand this decision at all. Maciej applied to YCF, not to just get the money, even if that's all he wanted. He should have either taken the money and the mentorship, or nothing at all.
HappyTypist 1 day ago 0 replies      
I think what YC has done was reasonable. It is true that you changed the rules, but Apply HN is like an MVP. Making things up as you go along is part of the ethos of startups, and what matters more is whether it's done in good faith.
OoTheNigerian 1 day ago 0 replies      
Since I commented on the previous ones, here is my take on this.

Dang and YC have done well and played their part. Maciej has not.

I am with @nickpsecurity in stating that Maciej is misappropriating the money invested in his company. Of course the argument can be made that

1. The money did not get to him at all.

2. Too much bad blood has made participation untenable.

Irrespective of the above, the spirit and intention of those who voted for him (I did not) and those who insisted he be given the money (I did) was for Maciej & Pinboard to participate in YC so we could see what happens and how it would affect the product.

If we had been looking for charities YC should donate to, we would have said so.

The aim of this scheme by YC was to engage and bring the community together, and instead it has brought a lot of awkwardness. This is mainly on Maciej and doesn't look good.

To Dang and co, keep up the experiments. Not everyone will turn out as planned.

markbao 1 day ago 0 replies      
It sucks that the outcome of the first Apply HN became complicated, but the resolution makes me confident that the people behind this are gracious and understanding, which matters more than the mistakes made this time around. Thank you.
Gratsby 1 day ago 0 replies      
I thought it was pretty clear from the beginning.

But I have to say it's pretty cool to see this response. Shouldering responsibility and admitting fault is something that most people avoid and most businesses don't even think about.

forgottenacc56 1 day ago 8 replies      
This is a cryptic post. What, in direct terms, happened?
josh_carterPDX 1 day ago 0 replies      
To Daniel and Kevin,

You are both awesome people who are genuinely interested in doing something different to help give new ideas a nudge forward. It sucks that this blew up, but I think the original intent still rings true and I thank you both for giving us a chance to be a part of it.

nickpsecurity 1 day ago 3 replies      
Excellent response, dang, by you and YC.

At the same time, what Maciej did with the money is more irritating now, given the point of the competition and his comments in it. I was more interested to see what business value he could create with it, given Pinboard's success. (rolls eyes)

acconrad 1 day ago 0 replies      
Cheers for doing what's right, double cheers for the donation!
rmason 1 day ago 1 reply      
How about letting the community nominate companies for consideration? I can think of a few bootstrapped companies in Michigan that are deserving. Whether HN would agree I don't know.

Rules would be that you couldn't have any equity in said company or have ever received any compensation from them.

Could the HN community vet them as well as you do?

soneca 1 day ago 0 replies      
I understood it as a game where you could change the rules up until you posted the poll, having already vetted out some ineligible applications and kept Pinboard, and having stated that the two most upvoted would pass. Then you messed up because you decided Pinboard was not eligible anymore. Personally I find this final decision and mea culpa correct and honorable. Congrats.

My suggestion: next time, require a private application for YCF with the sole purpose of deciding whether an application is eligible, on whatever (transparent) criteria you choose. And don't take it lightly. Be 100% sure that you would give the $20k and mentoring to any of the eligible applicants.

Then create detailed rules and guidelines that allow removing an eligible applicant from the poll, and publish them.

Then, only then, publish the poll and state that the 2 most upvoted will be selected.

_wmd 1 day ago 2 replies      
If you insist on drawing continuous attention to meta drama (this is something very new to HN since PG stepped back), then please at least document exactly what it is you're apologizing for rather than making some vague remarks then linking to 2 huge threads full of drama. I really tried to figure out what this was all about, but after 5 minutes I've given up.

This is really crap content; please cut it out.

zorpner 1 day ago 0 replies      
Well done -- excellent resolution.
rorykoehler 1 day ago 0 replies      
How to make the process better:

- give a 2 week window for applications to go up. Only publish the posts all at once, once the window shuts

- give every user one vote and one vote only so there is a cost to the user in voting

felixgallo 1 day ago 0 replies      
Much respect Dan. And classy, great way to handle it to both you and Maciej.
vs2370 1 day ago 0 replies      
Well handled. The positive side of all this is that COHSF got a $20k donation. So it wasn't that bad after all.
austinhutch 1 day ago 0 replies      
I was extremely disappointed reading the thread yesterday and the overall handling of the situation on YC's side. This apology and outcome is about as good as they come, though.
emirozer 1 day ago 0 replies      
I actually really like this outcome of 20k going to charity :)
staunch 1 day ago 0 replies      
Yay. Well done. Now please don't let this rocky start ruin such a uniquely promising idea. This is YC's chance to implement the first true diversity program in SV. https://news.ycombinator.com/item?id=11634217
waddabadoo 4 hours ago 0 replies      
Tough crowd here.
a_small_island 1 day ago 1 reply      
I almost posted an 'Ask HN' on this topic but didn't want to stir the pot. Since it was an online contest, was YC legally bound by the 'official rules', as it were, and did Pinboard have legal recourse? Did that factor into the ultimate decision to capitulate?
andy_ppp 1 day ago 0 replies      
Super impressive to see things resolved this way! Well done.
beatpanda 1 day ago 0 replies      
Wow, $20k to the Coalition is a great choice. That's fantastic news.
kevinwang 1 day ago 0 replies      
tdicola 1 day ago 3 replies      
> A note to HN users: The intention behind Apply HN was to do something new to excite and interest the community and engage it with YC in an interesting way. That did happen, but it pains me that it also partly turned into the opposite. If any of you have suggestions for how to do better, I'd like to hear them.

How about publishing rules for a contest and sticking to them, or canceling the contest entirely if it's not going the way you expected or desired? If there was a risk that some founder you didn't like would be picked, the rules should have been clear about how to prevent that (through an interview, etc.). And if you realize the rules didn't cover this contingency, then scrap it entirely and start over with new rules.

Frankly I'm surprised that this is being framed as some failing of the HN community. This is clearly a failing of the contest creators.

chris_wot 1 day ago 0 replies      
dang, good on you. I hope it's not too embarrassing if I publicly say this, but I'm more impressed than ever by your level-headed judgement and decision making, and even more so by the fact that when you realise you've made a mistake you openly say so, then take measures to make it right.

It's very hard to do this, and it's my firm belief that only those of good character have the strength and fortitude to do so!

Maciej, you're a gracious guy and the manner in which you handled this was also exemplary.

As an aside: dang has one of the toughest and most thankless jobs you can imagine: moderating HN and ensuring that trolls, unstable people and those with hurt feelings are fairly dealt with and at times corrected. I am one of those people dang has had to quietly speak with after I emailed HN, and his gracious, open and firm communication speaks volumes, and is one of the reasons why HN is one of the best and most interesting forums on the Internet.

koolba 1 day ago 0 replies      
This apology is definitely worth more than the $20K. Well spent and nice turnaround.
wodenokoto 11 hours ago 0 replies      
Is there a tl;dr version of all this?
smoyer 1 day ago 1 reply      
There is too much grief being felt on both sides of the table for a measly $20k. I hope everyone's happy with the outcome, but let's avoid this kind of pain in the future.
danieltillett 1 day ago 1 reply      
Dan are we going to have a specific postmortem thread on this experiment? I don't want to post a lot of specific suggestions in this thread as they would be rather off topic.
forgottenpass 1 day ago 1 reply      
If any of you have suggestions for how to do better, I'd like to hear them.

Don't open up a selection process if you're only going to accept results of the sort your insider selection process would have picked anyway.

In another thread you called Pinboard your "Boaty McBoatface scenario." The thing worth understanding is that Boaty was the best possible outcome of that poll. If they had wanted to keep to status quo of boring and vaguely-majestic names they should have never opened the process up. Boaty was not a failure, Mountain Dew's poll that selected "Hitler did nothing wrong" was.

Unless you're ready to paint the giant cartoon eyes on the bow of your ship, you don't actually want outsider ideas. So why are you asking for them?

yarou 1 day ago 0 replies      
Good on dang and whomever else was involved in this decision.

@dang - I know you sometimes have to make tough decisions where all parties involved end up unhappy with the end result. But I'm sure that the vast majority of the community appreciates what you do. At the very least I do.

It's not said very often, so I wanted to rectify that.

aaronbrethorst 1 day ago 0 replies      
Props to both of you.
curiousgal 1 day ago 0 replies      
Can someone eli5 who this Maciej is?
forrestthewoods 1 day ago 1 reply      
I can't actually find what Maciej submitted, why it was initially rejected, and why it was deemed within the rules. It's kinda hard to scroll through hundreds of posts on a small mobile screen. Can someone summarize those details?
solve 1 day ago 0 replies      
Of all the times I've seen internet votes given real-world value, the bad parts of this contest were among the mildest of the potentially bad outcomes. It went better than expected, honestly.
borski 1 day ago 0 replies      
This was precisely the correct resolution. Thanks 'dang and 'kevin for doing the right thing, and understanding the issue at hand. Sorry if we clashed a bit on the other thread, but I wanted to ensure you understood, clearly, how it was misread by so many, even if it didn't occur to you when you wrote it. That happens all too often in anyone's writing. :)


krmbzds 1 day ago 0 replies      
Pinboard is the best.
cmech 19 hours ago 0 replies      
I think you're missing the greater point here. From the original announcement a few things were quite clear: you welcomed a different perspective [0] and those who wouldn't want to go through the normal interview process [0], people were supposed to be nice [1], bias was to be avoided [2], and HN would decide. With this apology you've largely satisfied HN, but left everyone else wondering if they can ever trust a YC announcement again.

The things that were different about him were called out as faults (participating on his own terms, not knowing the rules, making you uncomfortable); he was forced into the normal interview process (which you both said wouldn't happen and explicitly tried to attract people who would want to avoid, then used that as an excuse to reject him); the "be nice" rule wasn't enforced on the many comments attacking his character as a result of this process; and bias wasn't avoided.

It's one thing to change the rules, another not to be true to the premise of the experiment. How could someone now in good faith recommend a YC initiative to someone who doesn't fit the mould, when you can't even make it work with "one of your own"?

[0] "Hacker News users have many diverse perspectives on technology and business. Perhaps if HN picked startups, it would pick differently than YC. Maybe different startups would be motivated to apply, if they knew that the interviewing and deciding would be done by the HN community." https://news.ycombinator.com/item?id=11440627

[1] "Anybody who applies to HN in public this way is putting both themselves and their baby in a super vulnerable position. We're going to rise to the occasion by being not only civil, but nice." https://news.ycombinator.com/item?id=11440627

[2] "It's easy to form some really bad habits when you sit in a position of power to judge the potential of a person, a team, an idea and their execution, believing that you know better and focusing your time on finding weakness." https://news.ycombinator.com/item?id=11440843

cmech 8 hours ago 1 reply      
Not only did you try to fix the competition and then hide the fact, you're now trying to do the same with legitimate questions. This is neither responsibility nor transparency, and it is surprisingly unbecoming of an organisation like Y Combinator that supposedly values merit.
hueving 1 day ago 0 replies      
Pinboard is an old company and not a startup in the sense of the SV term. It's just a small business that manipulated the community. I'm a bit disappointed that it received any money. :(

Democracy has to be more than two wolves and a sheep deciding what's for dinner.

genericpseudo 1 day ago 0 replies      
Sadly, regrettably, insufficient. You've just, finally, made good on the original terms; you haven't dealt with the insults thrown at Maciej's character. I would hope that you would have realized that was necessary too.

> If any of you have suggestions for how to do better, I'd like to hear them.

Y Combinator needs an impartial, independent ombudsman dedicated to tackling implicit and explicit bias both within the YC application process and, especially, on Hacker News itself.

GiveDirectly Planning to Give $30M in Basic Income to East Africa givedirectly.org
272 points by deegles  1 day ago   208 comments top 23
mwambua 1 day ago 12 replies      
I'm a Kenyan and I can tell you that this will do more harm than good. NGO money has been distorting our economy for years now: giving people livelihoods they can't sustain, creating job opportunities that add no value, and making our government vestigial. The poor people that you're supporting elect the same corrupt politicians every year because they have no idea that the government is supposed to be supporting them.

I pay 30% tax as a lower-middle income worker but I see very little of this used productively. The majority of voters don't pay taxes and rely on aid for everything from farm inputs to school fees. As a tax-payer who understands what taxes should do for me... I live with very poor infrastructure and amenities while the government squanders away money. If developed countries keep propping up our poor majority, they'll never learn to find innovative solutions to their problems and elect governments that will create sustainable wealth.

- Frustrated Kenyan

troycarlson 1 day ago 8 replies      
I've recently been fascinated by the thought of piping donations directly to poor populations and allowing them to use the money as they wish. I watched the documentary Poverty, Inc. [0] regarding the "poverty industry" and how endless foreign aid in these countries is killing the economy. If Toms Shoes dumps 1,000 pairs of shoes in the city center every week, how the hell would a cobbler stay in business? Or if the U.S. pumps an endless supply of rice into Haiti, how could Haitian rice farmers sell their inventory?

Instead, give these people the cash to purchase locally sourced goods or open a business of their own. Economies don't spring up out of a pile of free rice.

[0] http://www.povertyinc.org/

scottrogowski 1 day ago 10 replies      
While I find UBI to be a fascinating idea worthy of consideration, I vehemently disagree with running this experiment in East Africa. The proposal seems to be either blind to, or minimizing, the potential for unintended consequences. Decades of trying to alleviate poverty in Africa have taught us, more than anything else, that the road to hell is paved with good intentions. Giving food tends to act as dumping and weaken the agricultural industry. Providing foreign aid tends to empower corrupt governments. These consequences only became clear after years of hindsight. Who knows what the unintended side effects of UBI will be?

If we are going to try UBI, a fundamentally Western idea, it's only right that it should be tried in the West. I'm not saying it won't work - just that we don't really know what will happen. East Africa has suffered enough from our neo-colonial experiments.

kogepathic 1 day ago 6 replies      
Interesting concept. I'll be interested to see if it works.

I worked with a company building AC mini grids in East Africa. They connected around 130 households in each village to a 220V grid.

The expectation was that people would use the electricity for basic needs (e.g. light, cell phone charging) but also for productive uses (e.g. grinding grain to flour, air compressors, saws, etc).

The expectation was that as people entered the second phase of energy use (after basic necessities) their consumption would increase as their income increased.

What this company found though, was that very few customers followed this pattern. If people bought anything that increased their consumption, it was almost always a TV. This obviously didn't improve their income, so often their use would actually decrease because they can't afford the increased consumption.

Last I heard, they were having serious problems getting people in these villages to use more power.

Through some pricing campaigns, they discovered that people's monthly spend on energy stayed almost constant regardless of the price. If you made the price similar to that in well-developed countries ($0.10-$0.30 per kWh), people would use lots of energy, but the amount they paid was still relatively small. If you raised the price, people used much less, but ended up paying almost the same amount per month. So if it costs $0.25 per kWh people will use 4 kWh, but if it costs $1 per kWh they'll only use 1 kWh.

Since their model included batteries to cover the night load, they couldn't afford to sell the energy this inexpensively. Hence their wish to increase day time energy use among the villagers through equipment use (e.g. welding, grinding).

My expectation from this program: some small percentage of people will use the money to invest in tools and means to improve their economic situation. The vast majority though will waste the money on things which benefit them in the short term, but provide no long term economic benefit to them (e.g. better cell phone, television, or maybe a solar home system)

Also, from hearing the horror stories of doing business in East Africa (draconian regulations + corruption), I'm sceptical that anywhere close to 90% can make it into the hands of ordinary people. Hopefully this isn't the reality...

Not sure why I'm getting down votes. I've worked with a company actually working in East Africa and I don't see people using the opportunity of energy to improve their economic situation. After basic necessities they seem to use it for leisure.

agrona 1 day ago 0 replies      
GiveDirectly's previous work involves finding the poorest communities in the world and targeting the poorest members with direct cash transfers. I was turned on to them a few years ago by GiveWell (unrelated), which attempts to evaluate charities based on cost effectiveness and capacity for more funds.

In addition to the GiveWell reports, GiveDirectly also document and publish their work and research. It's exciting to see the impact that they are having (and how minimal the systemic abuse of the money is):



deegles 1 day ago 0 replies      
TL;DR: "With your help, we will run a long-term, universal basic income and study it rigorously to find out.

We think the whole thing will cost roughly $30M, of which around 90% of the funds will go directly to very poor households."

Aelinsaar 1 day ago 0 replies      
I like this. Everything I've learned about charity seems to indicate that in most cases, giving people who need money, money... is the way to go. Being able to examine the impact of a basic income in my view, is an additional benefit.
fche 1 day ago 1 reply      
"With this pilot, we want to provide a true test of a universal basic income."

... except for the part where taxes are collected from the population to pay for said universal basic income.

home_boi 1 day ago 1 reply      
The study won't be valid unless they wait at least 2 generations after UBI is implemented.

People born before UBI grew up with scarcity on their minds, which shapes the way they act with UBI.

People born right after UBI is implemented will be biased by parents who grew up with scarcity, like the children of American immigrants.

The study can only be valid when the parents of the person being observed have had UBI for their entire lives.

geekfactor 1 day ago 1 reply      
FYI the April 13 show of the Freakonomics Podcast takes on the question "Is the World Ready for a Guaranteed Basic Income?" and mentions some of the prior studies noted in this article.

[1] http://freakonomics.com/podcast/mincome/

brianbreslin 1 day ago 1 reply      
Can someone point me to any unbiased research on the micro/macro effects of basic income vs raising minimum wage? I've been struggling to find non-hyperbolic politically laden information on the subject.
safeharbourio 1 day ago 0 replies      
Let us jump in with our opinion (we are a startup from Kenya). This has the potential to do great good, and at the same time, great harm. As someone from this region, my first thought, like mwambua's, is the instant abuse by corrupt individuals as a get-rich-quick mechanism, and that's my biggest worry: will this project be able to ensure that needy people actually get this, and who decides who gets it? More or less this is the only thing I would have a problem with, as integrity issues down here have ruined, and continue to ruin, everything. If this program can by some means rise above that, then for sure this will be great, and I would definitely look forward to the results.

This is my take: if you can get it done transparently and with proper selection (weed out perennial NGO dependents), this may very well be the best social/political science experiment out there.

As a tangential observation, one of the reasons corrupt politicians get power down here is that they have the means (through dubious sources, usually stolen from the taxman) to dole out cash bribes/handouts to young, otherwise destitute/jobless guys. Should this experiment change the economic circumstances on the ground, the politics will also have to change, which may be quite frightening for politicians. Have you guys thought of what sort of reaction you will get from influential individuals with a stake in the status quo?

That said, viva la basic income, bring it ;) we are all for it. After all, what's the worst that could happen?

Should some of the concerns we've raised be addressed, this will definitely sidestep the problems mentioned in Haiti and other areas on this thread.

dragonwriter 1 day ago 0 replies      
FWIW, this isn't really testing what "Basic Income" is usually defined as in any area, since the benefit will be given to people in the target area when it starts, and will follow them if they leave (and not be given to new residents of the target area who arrive during the trial). It may also include different levels within the target area. So it's an experiment in giving free money to individuals, in an arrangement substantially different from a Basic Income.
larrik 1 day ago 0 replies      
"It is provided to everyone, regardless of need, forever."

How long will the experiment last? I feel like recipients knowing they are guaranteed a payout for at least a decade will behave very differently from recipients only knowing they have it this year and maybe next, even if they do end up getting it for a decade.

jrbapna 1 day ago 0 replies      
I'm looking forward to the day when the other half of the world gets "on the grid" so to speak. The rise of other economies will greatly benefit humanity as a whole.

Great experiment. Very excited to see where this goes.

thedevil 1 day ago 1 reply      
While it's noble to give away $30M, it won't necessarily give strong support for universal basic income. In order to have an effective experiment, you need to also tax those people $30M and redistribute it. That's missing from their checklist.

Of course free money is better than no free money! That's all this will prove and few will debate that. The real questions are: 1) is it worth the cost, and 2) is it better than alternatives?

I would personally donate to a large-scale A/B experiment where you give one group cash and the other group food, housing and education worth an equal amount and see which works better.

I worry that the recipients are going to be better off (because you just gave them $30M) and then everyone is going to want to implement a huge, risky (and fundamentally different) program in the US.

Note to avoid excessive downvotes: I like the idea of UBI in theory and I will support it one day but I think we're not there yet. I think we need more automation and more overall wealth before it makes economic sense - before benefits > costs.

mathattack 1 day ago 1 reply      
Seems like a great experiment. I think this will do wonders for both the people, and our knowledge of how to best provide aid.

2 things I'm interested in seeing:

1 - Does it make a difference if it's USD sent over, versus local currency? (Is it a cash shortage, or a foreign cash shortage?)

2 - How much inflation will occur?

jijji 1 day ago 0 replies      
sounds like another 419 scam waiting to happen. bye bye money. all you have to do is read the thousands of stories about different governments or NGOs attempting to "help", or "fund", or "develop" projects in that part of the world, and all the people have done is squander the money. History repeats itself if you don't learn from its mistakes. I think a better place to flush money down the toilet is an actual toilet, at least you know where it's going.
j4kp07 1 day ago 0 replies      
LMAO....they need donations to "test" Basic Income. That speaks volumes people.
aab0 1 day ago 0 replies      
I'm very interested in the outcome of this trial, and have donated $800 recently (using Bitcoin).
tn13 1 day ago 1 reply      
Letting markets function has helped poor countries like India, China and even North Korea. Making decisions on behalf of poor people has hurt the poor even in the richest countries, like the USA. Common sense to me.

There is always an issue of law and order though. If theft is a common problem then this may not work.

EGreg 1 day ago 1 reply      
Basic income is inevitable.

Wages have been falling relative to inflation, real estate, etc. because demand for human labor has been dropping. Consumer goods come down in price, but real estate, gold and other limited resources are a reference for how low demand for human labor is falling.

Technology enabled outsourcing, and automation, to erode the demand for human labor. The Luddites were 150 years too early. The capabilities of computers are only growing.

The next wave will be self-driving cars, kiosks at McDonald's and drone delivery. That will put a lot of people out of work.

Relying on wages to trickle money down only works when employers value their employees. These days we are in the intermediate period with a growing unemployed class, part time temp work, two year stints at companies, and stagnant wages for the average profession including developers.

There is no reason not to tax the productivity gains made by corporations through R&D and automation, and redistribute them to everyone. Alaska has been doing unconditional basic income funded by a tax on the use of its natural resources.

Most people will be 90% consumers and only 10% producers. Already, most full time jobs are just make-work. Conditional welfare and full-time jobs make people afraid they'll lose their check if they work on what they are passionate about. That's an infantile mentality that holds many back from living productive lives. They instead pretend to work or pretend to be poor, to keep getting that conditional check. It's time to let humanity at large grow up, stop wasting time and tap its potential.

awt 1 day ago 5 replies      
While initially in favor of basic income due to the income trap created by the traditional approach, I have come to oppose it due to a better understanding I now have of the nature of money. Consider that basic income works when you think of money in the micro context as a means of exchange for goods and services, but not when you think of it in the macro context as a reflection of the wealth of a society. Simply transferring money from one group to another does not make the second group wealthier.
Introducing TAuth: Why OAuth 2.0 is bad for banking APIs and how we're fixing it teller.io
308 points by AffableSpatula  2 days ago   173 comments top 38
JoshMandel 2 days ago 2 replies      
1. To my mind, the fundamental problem OAuth solves is "letting a user decide" to share data with an app, without making the user responsible for jumping back and forth between the app and her API provider (her bank, in this case). OAuth holds the user's hand through a series of redirects, and the user doesn't have to copy/paste tokens, or remember where she is in the flow, or know what comes next. Does TAuth have a similar capability? The blog post mentions "User Tokens" in passing, but doesn't define or describe them.

2. OAuth 2.0 is published as an RFC from IETF. It may be a bear to read (and yes, it's a framework rather than a protocol!), but the spec is open, easy to find, and carefully edited (https://tools.ietf.org/html/rfc6749). Is TAuth meant as a specification, or a one-off API design? If it's a specification, has there been an attempt to write it down as such?

chatmasta 2 days ago 10 replies      
One big problem with OAuth on mobile apps is this scenario. I've seen this in the wild for non security-critical apps. As far as I can tell, it's not a bug so much as it is a problem with the OAuth protocol and webview permissions:

1) MyLittleApp wants OAuth access to BankOfMars

2) MyLittleApp bundles BankOfMars SDK into MyLittleApp

3) MyLittleApp requests oauth access via SDK

4) SDK opens WebView for user to log into BankOfMars

5) MyLittleApp has full control over the DOM presented to the user since the WebView is technically its own.

6) MyLittleApp extracts the user's password from the DOM of the WebView

7) MyLittleApp disappears and... profit?

andrewstuart2 2 days ago 2 replies      
In my opinion, having worked extensively with OAuth2 (mostly in the form of OIDC) and other modern AuthN/Z protocols, the author of this post does not truly understand OAuth 2, nor have they looked in any appropriate depth into supplements like OIDC or alternatives.

For one, bearer token [1] is only one type of "Access Token" described by the OAuth2 spec [2]. In fact, the OAuth2 spec is very vague on quite a few implementation details (such as how to obtain user info, how to validate an Access Token), which the author seems to just assume are part of the spec, as he does with bearer token. Other parts, like the client/user distinction, and the recommendation for separate validation of clients, the author ignores completely, generating his own (ironically mostly OAuth2-compliant [3]) spec.

> Shared secrets mean no non-repudiation.

Again, not true. Diffie-Hellman provides a great way to come to a shared secret that you can be cryptographically sure (the adversary's advantage is negligible) is shared between you and a single verifiable keyholder.
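The Diffie-Hellman point above can be sketched with toy parameters — a minimal, deliberately insecure illustration (the prime and generator below are chosen for readability, not security; real deployments use vetted groups via a crypto library):

```python
# Toy Diffie-Hellman key agreement: both parties derive the same shared
# secret without it ever crossing the wire. Illustrative only -- the
# parameters below are NOT a vetted group; use a real crypto library.
import secrets

p = 2**127 - 1   # a Mersenne prime; fine for illustration only
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's private exponent
b = secrets.randbelow(p - 2) + 2   # Bob's private exponent

A = pow(g, a, p)                   # Alice sends A to Bob in the clear
B = pow(g, b, p)                   # Bob sends B to Alice in the clear

shared_alice = pow(B, a, p)        # Alice's view of the secret
shared_bob   = pow(A, b, p)        # Bob's view of the secret
assert shared_alice == shared_bob  # identical, though never transmitted
```

An eavesdropper sees only p, g, A and B; recovering the secret from those is the discrete-log problem, which is what makes the agreed key usable for keyed MACs afterwards.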

> Most importantly using JWT tokens make it basically impossible for you to experiment with an API using cURL.

sigh. If only there was a way to write one orthogonal program that can speak HTTP, and in a single cli command send that program's output to another program that can understand the output. Maybe we could call it a pipe. And use this symbol: |. If only.

> OAuth 2.0 is simply a security car crash from a bank's perspective. They have no way to prove that an API transaction is bona fide, exposing them to unlimited liability.

TL;DR: This article, led by comments like this ("unlimited", really?), strikes me as pure marketing (aimed at a naive audience) for a "spec" that probably would not exist had proper due diligence into alternatives, or perhaps some public discussion, occurred. At the very least, inconsistencies (a few of which I've mentioned above) could have been avoided.

[1] https://tools.ietf.org/html/rfc6750 [2] https://tools.ietf.org/html/rfc6749 [3] https://tools.ietf.org/html/rfc6749#section-2.3.2

amluto 2 days ago 3 replies      
Some features that I think a system like this should have:

1. The client (or the device holding the authentication token, or the app, etc) should be able to maintain (on its own storage!) an audit log of all transactions it has authorized, that log should be cryptographically verifiable to be append-only (think blockchain but without all the Bitcoin connotations), and the server should store audit log hashes and verify that they were only ever appended to. And the server should send a non-repudiable confirmation of this back to the client.

Why? If someone compromises the bank or the bank-issued credentials (it seems quite likely that, in at least one implementation, the bank will know the client private keys), the client should be able to give strong evidence that they did not initiate a given transaction by showing (a) their audit log that does not contain that transaction and (b) the server's signature on that audit log.
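A hash-chained, append-only log along these lines is easy to sketch. This is a minimal illustration, not amluto's or Teller's design — the record layout and field names are invented for the example:

```python
# Minimal hash-chained audit log: each record's hash covers the previous
# record's hash, so any tampering with or deletion of earlier entries
# breaks the chain and is detectable by re-verification.
import hashlib
import json

def append(log, entry):
    """Append an entry; returns the new head hash (the value a server
    could countersign as evidence of what the client has authorized)."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"prev": prev, "entry": entry}, sort_keys=True)
    digest = hashlib.sha256(body.encode()).hexdigest()
    log.append({"prev": prev, "entry": entry, "hash": digest})
    return digest

def verify(log):
    """Recompute the whole chain from the genesis value."""
    prev = "0" * 64
    for rec in log:
        body = json.dumps({"prev": rec["prev"], "entry": rec["entry"]},
                          sort_keys=True)
        if rec["prev"] != prev:
            return False
        if hashlib.sha256(body.encode()).hexdigest() != rec["hash"]:
            return False
        prev = rec["hash"]
    return True
```

With the server storing (and signing) only the latest head hash, the client can later prove a disputed transaction was never appended to its log.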

2. Direct support for non-repudiable signatures on the transactions themselves. Unless I'm misunderstanding what the client certs are doing in this protocol, TAuth seems to give non-repudiation on the session setup but not on the transactions themselves. Did I read it wrong?

3. An obvious place where an HSM fits in.

How does TAuth stack up here?

Also, there's a very strange statement on the website:

> to unimpeachably attribute a request to a given developer. In cryptography this is known as non-repudiation.

Is that actually correct as written or did you mean "to a given user"?

jhuckestein 2 days ago 1 reply      
As much as I love Stevie, teller.io and this demo: Why not both?

OAuth 2 is not "bad" in general, you just need to consider the implications of using it. If you have an API that allows clients to move customers' money or take out loans, you should take additional steps to defend against MITM attacks. For example using client side certificates :)

That said, TAuth looks really good and tidy. Of course the developer may still lose the private key, so in the end you'll always need to additionally monitor API requests for suspicious behaviour.

btilly 2 days ago 0 replies      
The main complaint about OAuth 2.0 seems to be that bearer tokens are a bad idea. Well, you can implement OAuth 2.0 to use any kind of token you want, with any property you want. People do bearer tokens because it is easy, not because it is required.

The secondary complaint seems to be that OAuth 2.0 is a mess. That one I heartily agree with! A few years ago I wound up having to figure out OAuth 2.0 and wrote http://search.cpan.org/~tilly/LWP-Authen-OAuth2-0.07/lib/LWP... as the explanation that I wish I had to start. In the process I figured out why most of the complexity exists, and whose interests the specification serves.

The key point is this: OAuth 2 makes it easy for large service providers to write many APIs that users can securely authorize third party consumers to use on their behalf. Everything good (and bad!) about the specification comes from this fact.

In other words, it serves the need of service providers like Google and Facebook. API consumers use it because we want to access those APIs. And not because it is a good protocol for us. (It most emphatically is a mess for us!)

ForHackernews 2 days ago 1 reply      
> One of the biggest problems with OAuth 2.0 is that it delegates all security concerns to TLS but only the client authenticates the server (via it's SSL certificate), the server does not authenticate the client. This means the server has no way of knowing who is actually sending the request.

That's just plain not true. In the OAuth2 authorization_code grant, a "confidential" client is REQUIRED to send a client_id and client_secret to authenticate itself to the server.


> If the client type is confidential or the client was issued client credentials (or assigned other authentication requirements), the client MUST authenticate with the authorization server as described in Section 3.2.1.

Now, this doesn't work for "public" clients like a pure-javascript webapp, but that's a separate question.
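On the wire, the confidential-client token request described above looks roughly like this — a sketch using a hypothetical token endpoint and invented credentials (RFC 6749 permits HTTP Basic auth over the client credentials; the request is constructed but not sent here):

```python
# Sketch of an authorization_code token exchange by a confidential
# client. The endpoint, credentials and code below are hypothetical.
import base64
import urllib.parse
import urllib.request

token_url = "https://bank.example/oauth/token"   # hypothetical endpoint
client_id, client_secret = "my-app", "s3cret"    # issued at registration

# The client authenticates itself with HTTP Basic over its credentials.
basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()

body = urllib.parse.urlencode({
    "grant_type": "authorization_code",
    "code": "AUTH_CODE_FROM_REDIRECT",           # placeholder value
    "redirect_uri": "https://my-app.example/callback",
}).encode()

req = urllib.request.Request(
    token_url,
    data=body,
    headers={"Authorization": f"Basic {basic}"},
)
# urllib.request.urlopen(req) would return the token response; not run here.
```

The point being: the server does learn who the client is, provided the client is confidential and was issued credentials.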

Count me as pretty dubious of letting some unknown group try to re-implement bank authentication without fully understanding the specification they're trying to fix.

krooj 2 days ago 1 reply      
Their description of the MITM attack is entirely dependent upon how the authorization server validates redirects in the implicit and authorization code grant flows. This is tied to how client registration is performed. So, if you want to ensure that the authorization code or access token is only delivered to a redirect URI that is trusted, that should be part of the policy enforced in your infrastructure... More specifically, you can require domain verification and validation as part of the client registration process, and I would expect that at a minimum when dealing with delegated access to financials.

Another alternative to this would be to perform an OOB flow, wherein the redirect URI is actually hosted on the authorization server itself and the client can scrape the access token from the Location header.

thallium205 2 days ago 2 replies      
This is unnecessary. Many banks can and will enforce 2-factor authentication with their oauth flow, which sufficiently validates the client and would prevent a MITM attack.

Your whole premise rests on the threat that a client browser would not properly validate a server certificate... come on... really?

JDDunn9 2 days ago 2 replies      
I visited the homepage (https://www.teller.io/) and got a warning about the SSL cert being invalid. Kind of ironic. :)
bgidley 2 days ago 3 replies      
This is unlikely to work - developers in general can't cope with managing SSL certificates. They won't know what to do with them or handle them securely.

You need full integrity verification, with a secure store and whitebox crypto keys to make such a scheme secure.

pkulak 2 days ago 0 replies      
Problem one exists because, apparently, MITM is a problem with TLS because it's possible for bogus certificates to get through? Well... I guess. But then that's a TLS problem. And your entire banking website is served through TLS. So, if it really is an issue, then solving it just for auth is like putting an Abus padlock on a screen door.

Problem two bemoans the bearer token in Oauth 2. Yes, it's not as secure as OAuth 1, but it's also far simpler. But you don't have to use bearer tokens; you are free to use MAC tokens instead. Why reinvent the wheel?
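A MAC-style scheme of the kind mentioned above can be sketched like this — a minimal HMAC request-signing illustration (the header names and message layout are invented for the example, not the exact format of the OAuth MAC token draft):

```python
# MAC-style token use: the client signs each request with a key it
# shares with the server, so the key itself never crosses the wire --
# unlike a bearer token, intercepting one request doesn't yield a
# reusable credential for arbitrary other requests.
import hashlib
import hmac
import time

def sign_request(key, method, path, body=b""):
    """Return headers carrying a timestamp and an HMAC over the request."""
    ts = str(int(time.time()))
    msg = b"\n".join([method.encode(), path.encode(), ts.encode(), body])
    mac = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return {"X-Timestamp": ts, "X-Signature": mac}

def verify_request(key, method, path, headers, body=b""):
    """Server side: recompute the HMAC and compare in constant time."""
    msg = b"\n".join([method.encode(), path.encode(),
                      headers["X-Timestamp"].encode(), body])
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers["X-Signature"])
```

Covering the timestamp in the signature also gives the server a cheap way to reject stale or replayed requests.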

wyattjoh 2 days ago 0 replies      
I think my biggest bug here is that as far as I understand this flow, it essentially says that a given certificate that is generated and signed by a third party (Teller in this case) would be expected to bundle this private certificate with the application. Isn't it possible to extract the certificate from the app bundle after the fact? Or am I missing something here...
zaroth 2 days ago 1 reply      
The premise for adding client certificates is a MITM made possible because careless app developers will disable server certificate validation.

So, how exactly does adding a client certificate solve that problem? If server certificate validation is disabled on the client, the MITM can still accept the client certificate and substitute their own.

The difference is that in this case the attacker will gain access to the API but the client will not, unless they are being actively MITM'd. If the client tries to access the API outside the MITM, their client cert will be rejected as invalid.

arnarbi 2 days ago 0 replies      
Client certs are still a bit of a pain. There is already an IETF spec in the works, called Token Binding, on how to bind tokens to key pairs that clients maintain, and create on demand.



It's already implemented in Chrome.

0x0 2 days ago 4 replies      
I thought client certificates were being phased out, didn't Chrome just remove the <keygen> html tag?
guelo 2 days ago 0 replies      
> The EU is forcing all European banks to expose account APIs

I'm so jealous!

EGreg 2 days ago 1 reply      
Actually oAuth 1.0 is less secure than oAuth 2.0 because it engages in security theater. It doesn't even require https and as a result any man in the middle can eavesdrop on the requests. And if the token is leaked, it's game over.
imaginenore 2 days ago 2 replies      
Relying on SMS for bank security has always seemed crazy to me. It's not secure. Didn't the Telegram creator just get hacked by a Russian mobile provider that sent an SMS to itself?
educar 2 days ago 0 replies      
One of the things about OAuth is that the user needs to check the website URL where he is giving his credentials. Amusingly, many mobile apps seem to forget this important bit. They redirect me to a web UI inside the app itself and expect me to enter my password inside the app. I guess they thought this was a better user experience than handing over control to the browser :/
cakoose 2 days ago 0 replies      
Two things:

1. Why not just add client-side certificates to an OAuth-based API?

2. Client certificates do not prevent an attacker from pretending to be the server.

Let's say your API server followed the standard OAuth 2.0 protocol except required client-side certificates? Would that be as secure as TAuth?

If so, then the OAuth 2.0 option has the advantage of being well-supported by existing libraries and well-understood from a security perspective. It's less likely that a previously-unknown issue with OAuth 2.0 will crop up and force everyone to scramble for a fix.

And while client certificates prevent an attacker from forging client requests (i.e. tricking the API server), an attacker can still trick the client. An attacker capable of MITM'ing server-cert-only HTTPS can also trick TAuth clients into sending their banking API requests to the attacker's servers. It can respond to those requests with whatever it wants.

To summon the activation energy to adopt (or switch to) a new, less-popular protocol, I'd expect more security benefits.

yodasan 2 days ago 0 replies      
So, it seems like the main concern here is that a client will not validate the SSL certificate, so the SSL layer is now manually added into javascript code using the WebCrypto API to prevent this? I see not validating SSL certificates being a potential problem with something like a REST API, but is it common to disable SSL verification at the browser level where you would need to use javascript to do this?
deathanatos 2 days ago 0 replies      
> Most importantly using JWT tokens make it basically impossible for you to experiment with an API using cURL. A major impediment to developer experience.

Why can't a developer do exactly what you did in your second video, which is to save the JWT to a variable, and then use it in the request?

Heck, you could create a quick wrapper "jwt_curl"/"jwt_http" or something that automatically pulled in that variable.

There are two big things about this scheme that leave me confused: how do you know what the correct certificate for the client is? Do you just send it over HTTPS? But then, one of your opening premises is that we don't get TLS verification correct and are open to MitM, so this seems to contradict that, or are we hoping that "that one request won't be MitM'd", like in HSTS? (which seems fine)

iffycan 2 days ago 0 replies      
How does this compare with SimpleFIN: https://bridge.simplefin.org/info/developer

SimpleFIN seems simple and still secure. But maybe I'm missing something?

hobarrera 2 days ago 0 replies      
It's still a bit unclear to me how a client generates his certificates and somehow links them to his bank account. The demo shows a web UI generating it, but would a mobile user have to visit the website to fetch a certificate?
e12e 2 days ago 0 replies      
Let me see if I understand this correctly:

1) Problem: app authors disable TLS (server) cert validation.

2) Solution: give each app author the responsibility of managing and distributing a client side certificate.

Sounds like now you have two problems? In particular, you now have to make sure that every lost/compromised certificate is added to your growing CRL? And you need app developers that demonstrably do not even have the vaguest idea how public key cryptography can be used for authentication to take responsibility for doing this? And there's still no guarantee that they won't disable certificate verification?

Did I miss anything?

makecheck 2 days ago 0 replies      
By logging into a 3rd-party site using Google+, for instance, you remain logged-in to Google when you go to any other web site.

And the authenticator clearly does not require this global behavior: if you immediately log out from a Google page, you remain logged in at the 3rd-party site that you started from. So why doesn't it log you out globally? Probably to convenience Google, at the expense of security when you auto-identify yourself to who knows how many other web sites before you realize what happened.

Logging into one page with one set of permissions should mean "LOG INTO THIS PAGE", not "REVEAL MY SECRETS TO THE INTERNET".

EGreg 2 days ago 0 replies      
At Qbix we developed a much more secure way than oAuth to instantly personalize a person's session -- and even connect the user to all their friends -- while maintaining privacy of the user and preventing tracking across domains by everyone except those they choose to tell "I am X on Y network" ... it also restores the social graph automatically on every network and tells you when your friends joined one of your networks.
eemph 2 days ago 0 replies      
>The EU is forcing all European banks to expose account APIs with PSD II by end of 2017.

Any reference for this? The text of PSD II is here http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:320... but it's too long and it isn't clear to me whether it is actually ratified.

seanwilson 2 days ago 0 replies      
What bothers me about OAuth is the way you're on one website and are then asked with a pop-up to enter your Gmail or Facebook etc. password as a normal part of the flow. Users aren't savvy enough to check the URL or understand what's going on here so getting them used to this flow is asking for phishing by the look of it. Something that forced two factor authentication would be good.
smarx007 2 days ago 0 replies      
It's pretty strange to see a new authentication protocol (they describe it as authorization protocol, but they do authentication as well), just as W3C's WebID-TLS is being finalised. Oh, did I mention it uses client X.509 certificates as well? And how does the author imagine that banks would rely on his new protocol to ensure non-repudiation?
defiancedigital 2 days ago 0 replies      
I didn't see anything about renegotiation. If clients present their certificates during the first handshake, it will lead to security concerns. Attackers could observe clients' certificates (extract metadata, de-anonymize clients, ...). If renegotiation is used, it will drastically reduce the "Bonus DDOS mitigation".
0xdeadbeefbabe 2 days ago 1 reply      
> The most realistic threat is the client developer not properly verifying the server certificate, i.e. was it ultimately signed by a trusted certificate authority?

From an attackers point of view, this sounds like a very tiny ray of hope. It sounds like a cool feature/vulnerability that will probably be going away soon because it is so easy to fix.

baoha 2 days ago 0 replies      
tl;dr: it forces the client to have a certificate so that the server can verify it.

This is kind of a pet peeve. Anyone who ignores or wants to disable server certificate verification has to understand the risk.

chrisallick 2 days ago 0 replies      
Has this been tested against a broad user base? It seems rather involved.
gyre007 2 days ago 1 reply      
It's kinda crazy that it has taken so long for someone to actually take the initiative and attempt to make the authentication more secure.

I wonder if this is a custom-built solution or if Teller.io is using something like HashiCorp's Vault to do the whole SSL cert dance.

Either way, this looks promising.

poorman 2 days ago 1 reply      
Must be British... "authorisation" ?
Understanding the bin, sbin, usr/bin , usr/sbin split (2010) busybox.net
360 points by kylerpalmer  1 day ago   119 comments top 18
m45t3r 1 day ago 7 replies      
For all the people who think this is still an issue: it is not:

* Arch Linux: https://wiki.archlinux.org/index.php/Arch_filesystem_hierarc...

* Fedora: https://fedoraproject.org/wiki/Features/UsrMove

* Debian: https://wiki.debian.org/UsrMerge

* Ubuntu: https://wiki.ubuntu.com/FoundationsTeam/Specs/Quantal/UsrMer...

And this has been true for quite a long time. I think the last distro that did the /usr merge was Debian, and that was already 3 years ago. So I find it surprising that people still think this is an issue.

The only thing that remains is the difference between bin and sbin that Fedora/Ubuntu/Debian still maintain (Arch Linux simply merged everything).

P.S.: of course, there are still distros that use a split /bin and /usr/bin, /lib and /usr/lib, etc. OpenWRT is one of them. However, I think the major distros have already migrated.

bbanyc 1 day ago 2 replies      

If you look at the very early versions of the Unix manuals, before even /lib was invented, there was a notion that /bin was the "system" (section 1) and /usr/bin was "user software" (section 6). This fell by the wayside when they ran out of disk space on / and started putting section 1 commands in /usr/bin. At some point the system/user distinction was abandoned and everything but the games got moved to section 1 of the manual.

(Back in those early days /lib files used to be in /etc. So did /sbin. There's still a meaningful semantic difference between programs any user could run vs programs only useful to root.)

On *BSD there is no initramfs equivalent and the / partition still serves its traditional role of the minimal startup environment. And /home never existed before Linux - it was always something like /usr/home as far as I can tell.

nickpsecurity 1 day ago 3 replies      
NixOS is one of the few distros that recognizes and solves this problem:


A solution like theirs should become the default. Their transactional package management too, given the number of times my Linux packages got broken by some ridiculous crap. That even a smaller outfit could knock out two problems, one major, shows these bigger distros could be gradually knocking out such issues themselves.

ChuckMcM 1 day ago 2 replies      

First there was /bin for the things that UNIX provided and /usr/bin for the things that users provided. Part of the system? It lived in /bin. Optional? It lived in /usr/bin.

In SunOS 4 (and BSD 4.2 as I recall) the introduction of shared libraries meant you needed a "core" set of binaries for bootstrap and recovery which were not dynamically linked, and so "static" bin or /sbin (and its user equivalent /usr/sbin) came into existence. Same rules for lib. /lib for the system tools, /usr/lib for the optional user supplied stuff.

Then networking joined the mix and often you wanted to mount a common set of things from a network file server (saved on precious disk space) but there were things you wanted locally, and that gave us /usr/local/bin and /usr/local/lib. Then in the great BSD <==> SVR3.2 merge, AT&T insisted on everything optional being in /opt, which had a kind of logic to it. Mostly about making packages that could be installed without risking system permission escapes.

When Linux started getting traction it was kind of silly that they copied the /bin, /sbin, /usr/bin, /usr/sbin stuff, since your computer and you are the only "user" (usually), so why not put everything in the same place? (Good enough for Windows, right?)

File system naming was made more important by the limited permission controls on files: user, group, and other is pretty simplistic. ACLs "fixed" that limitation but the naming conventions were baked into people's fingers.

The proliferation of places to keep your binaries led to amazingly convoluted search paths. And with shared libraries and shared library search paths, even more security vulnerabilities.

IshKebab 1 day ago 1 reply      
I wonder what is more likely in our lifetimes - a sane Linux filesystem layout, or viable fusion power. Honestly I'm not sure.
Annatar 1 day ago 2 replies      
The elephant in the room is /opt, /etc/opt, and /var/opt. The System V and filesystem hierarchy specifications say that those locations are, and I quote, "for third party and unbundled applications". Yet some distributions, like for instance Debian or Ubuntu, do not even include them, precluding commercial software vendors from ever delivering software for those operating systems (no, an unbundled application can never be delivered into /usr, because that is the vendor's namespace, and an updated version from the vendor might very well bring a production application down).

/opt: application payload

/etc/opt: application's configuration files

/var/opt: application's data.

For applications with more than two configuration or data files, it is a good idea to create /etc/opt/application and /var/opt/application.

If your company is mature enough to understand what system engineering is, and employs dedicated OS and database system engineering departments, /opt/company, /etc/opt/company[/application], and /var/opt/company[/application] become very practical. If this approach is combined with delivering the application payload and configuration management as operating system packages, one only need worry about backing up the data in /var/opt, and that's it.

dredmorbius 1 day ago 1 reply      
Reasons why you still might want to keep / and /usr isolated:

1. NFS mounts. Your local or initial BOOTP image has a minimal root, your (non-root-writeable, BTW) NFS /usr has Other Stuff.

2. Mount options. Root is often (and perhaps still must be -- /etc/mtab for example -- I've stopped closely tracking discussion to make root fully read-only) writeable. It may also require other mount permissions, including device files and suid. Other partitions don't require these permissions, and there is some Principle of Least Privilege benefit to mounting such partitions without them. /usr requires suid, but not dev, and may be nonwriteable except for OS updates.

3. Recovery partition. I'll frequently stash a second root partition, not typically mounted. It has minimal tools, but enough to bootstrap the system if the primary is hosed for whatever reason (more often myself than any other). Without a clean / /usr split, this becomes more complicated.

josephscott 1 day ago 2 replies      
I like that FreeBSD dedicates a man page, HIER(7), on this topic - https://www.freebsd.org/cgi/man.cgi?query=hier - "layout of file systems"
teamhappy 1 day ago 2 replies      
/{bin,sbin} is for stuff you cannot live without (e.g., sh, mkdir, chmod, rm, ln, etc.)

/usr/{bin,sbin} is for stuff you can live without but expect to be there (e.g., make, grep, less).

/usr/local/{bin,sbin} is for stuff you/your local admin installed (e.g., mosh, cmake).

Also, I use $HOME/{bin,sbin} for wrapper scripts, binaries that I need but don't want to install system-wide (single-file C utils that come without man pages, stuff like that).

I'm not sure where the confusion comes from and I don't really see any advantage in merging / and /usr. On the flip side, I do think there's value in keeping /{bin,sbin} as small as possible (because that stuff has to work).

takeda 1 day ago 3 replies      
MacPorts use /opt/local :)
shirro 1 day ago 1 reply      
The /bin vs /usr/bin split makes perfect sense, but I always thought /sbin was superfluous, so I'm happy to see it being deprecated by many distros.

I expect, with the increasing moves towards app stores and sandboxing on all platforms, that the days of installing package contents all over the filesystem are numbered, and things like xdg-app are probably going to take over, with an app being mounted into the filesystem in a sandbox as it is run.

RijilV 1 day ago 0 replies      
One outcome of this that drove me nuts back in the day on Debian systems was that libpcre was installed in /usr and grep was in /bin. This meant perl regexes were not supported, because the maintainers of Debian didn't want the dependency on /usr from things in /bin, and didn't want to "bloat" / with something as distasteful as libpcre.
SquareWheel 1 day ago 4 replies      
How badly would things break if we tried to "fix" this split today?
paxcoder 1 day ago 1 reply      
I want to react to the signature FUD: GPLv3 is clearly a superior copyleft license to GPLv2 despite Linus' latently permissive opinion on the anti-TiVoization clause.
rhabarba 1 day ago 0 replies      
Today I learned that /usr actually means user, not Unified System Resources. Damn, one less thing I can nitpick about.
chrisamanse 1 day ago 1 reply      
I thought 'usr' stands for "Unix System Resources"?
chris_wot 1 day ago 0 replies      
Well this partially answers my question at https://news.ycombinator.com/item?id=11647487
SFJulie 1 day ago 0 replies      
I love the quote in the signature :

GPLv3: as worthy a successor as The Phantom Menace, as timely as Duke Nukem Forever, and as welcome as New Coke.

Phrack 69 released phrack.org
375 points by ChrisArchitect  2 days ago   52 comments top 16
wcummings 1 day ago 1 reply      
Whoa, wasn't expecting to see this, it's been a while.

Phrack has a technical depth you don't often see on HN, it's a shame there aren't more people producing content like this.

Tiksi 1 day ago 1 reply      
I am so happy to see this today. I've missed this style of writing and publication in today's noise of mostly self promoting blog posts pretending to be informational. Hopefully they keep up with the paperfeed and continue to release issues.

Mirrored for anyone behind a corp firewall that might block phrack.org: http://paste.click/s/amfVqd

fitzwatermellow 1 day ago 2 replies      
I enjoyed "The Fall of Hacker Groups". Clearly there is some deep nostalgia for the 2600 days out there ;)

"The only attitude consonant to our search for a comfortable, safe life is to constrain ourselves to our own limitations, ignore the intelligent life out there, and surrender to the mediocracy that our society has condemned our leisure time to."

fauria 1 day ago 0 replies      
Here you can find many of those text files that used to sit on BBSs and floppy disks back in the day: http://www.textfiles.com/directory.html
victorhugo31337 1 day ago 1 reply      
Couldn't be happier that Phrack is still alive and well--still miss BSRF:


m00dy 1 day ago 0 replies      
"Smashing the Stack for Fun and Profit". Everything started with that for me.
__jal 1 day ago 0 replies      
It makes me happy that Phrack is still kicking.

I stumbled over Phrack around issue 20-something, and have read every issue since then (and at least most of the earlier ones), something I can't actually say about the two magazine subscriptions I've kept since becoming adult-shaped.

0x0 1 day ago 1 reply      
Are they working through a backlog? The OSX article mentions 10.8.2 as being the most recent release...
noir-york 1 day ago 0 replies      
"Smashing the Stack for Fun and Profit", the teardrop attacks against NT, the IGMP bug in Windows 95, IDA Pro, OllyDbg, C, asm, finding help, and "test boxes" on IRC...

Geez a few years have passed since then!

jcoffland 1 day ago 0 replies      
Awesome. I first started reading Phrack back in the early 90s along with 40Hex and Cult of the Dead Cow. Those were the days.
PavlovsCat 1 day ago 1 reply      
The conclusion of the article titled "The Fall of Hacker Groups"...

> Furthermore, we dread the thought of being alike, of sharing multiple views and opinions. As such, we are turning progressively judgemental of who we should be partnering with, on the basis that "they do not understand". [..] No one ever feels like we do. They are not to be trusted and we do not have the time for them. The only attitude consonant to our search for a comfortable, safe life is to constrain ourselves to our own limitations, ignore the intelligent life out there, and surrender to the mediocracy that our society has condemned our leisure time to.

...reminded me of this:

> Even those of the intelligent who believe that they have a nostrum are too individualistic to combine with other intelligent men from whom they differ on minor points.

( from http://russell-j.com/0583TS.HTM )

Even knowing this, and knowing it's silly, doesn't really change it; it's a more ingrained habit than that, at least for me. But it's worth a mental note to self and those whom it may concern :)

yeowMeng 1 day ago 0 replies      
I really enjoy this type of stuff, but always wonder why hateful words sometimes get melted in.

Edit: changed to existential claim.

kriro 1 day ago 1 reply      
The Solar Designer prophile was a fun read:)
BorisMelnik 1 day ago 1 reply      
very cool, Phrack was one of my favorite publications along with 2600. I am really happy to see this. Also there was a writer for Phrack from a long time ago I used to be friends with IRL, does anyone know who I could contact to help find him?
daveloyall 1 day ago 2 replies      
What is PO?
sboselli 1 day ago 1 reply      
Is there a way to get it in an RSS feed?
Microsoft no longer allows admins to block Windows Store access in Win10 Pro zdnet.com
293 points by walterbell  2 days ago   239 comments top 25
Someone1234 2 days ago 15 replies      
If Microsoft wants SMBs to use Enterprise then make the Enterprise edition more easily available to SMBs, don't try to force them to move to it by making petty little changes to make their life more difficult.

I can go on the Google Apps website right now and buy myself seats with a few clicks and a few minutes. If I want to buy Windows Enterprise licenses it will take weeks, cost an unclear amount (at the onset), and I'll have to talk/negotiate with pointless sales drones.

I worked for a startup, under 20 machines, and tried to buy them Windows 7 Enterprise. Microsoft's partners were super unhelpful, disinterested in a small account, refused to provide clear pricing, and I was getting upsold even before we got the basics squared away ("I'll just add on 20 CALs, a Windows Server license, and let's talk Exchange!"). Ultimately we just gave up, and used Windows 7 Home(!) for three years.

People want to give Microsoft money, but Microsoft is intent on making the entire thing as painful as possible and their licensing as obtuse as possible. Office 365 Business gets a lot of shit, but it is a dream come true for startups, you pay one cost, and one user gets their Office license key, email, and some cloud storage taken care of.

Where is the Windows version of Office 365? Why can't I just pay a per user fee and get one Windows Enterprise key, the CAL, and Azure-based AD?

Time is money, and Microsoft likes to waste a lot of time. I'd prefer to spend a few dollars more a year and have a simple, streamlined licensing process than spend weeks being jerked around just to maybe get a few bucks off of a fake price anyway.

camperman 2 days ago 3 replies      
I always snigger at the formula of Microsoft's public statements. You just know that in the first paragraph it will claim - disingenuously - to be doing the exact opposite of whatever it's accused of. And sure enough the very first sentence is:

"Microsoft is focused on helping enterprises manage their environment while giving people choice in the apps and devices they use to be productive across work and life."

You can take this sentence, and without knowing any other details at all, figure out that the company is somehow preventing enterprises from managing their environments properly or restricting app and device choice or both.

drglitch 2 days ago 1 reply      
Microsoft enterprise/SMB sales process (via channel partners) is a laughable disaster.

Recent task: Run a Windows Server VM on Azure, with 4 remote desktop users connecting in.

Result: almost a MONTH of back and forth with THREE different MSPs, since none of them knew the details of proper licensing. In fact, even Azure support did not know the licensing terms and said just to contact the MSPs, who in turn advised we contact Azure support. In the end, after a couple of days of googling and reading obscure MSDN entries, we THINK we got the right licensing approach.

Oh, and total cost difference between different MSPs on even such a small order was over 40%.

Sadly, it's currently a classic example of "please take my money" and the company doing everything in their power not to. Until Microsoft clears up their licensing terms and makes pricing transparent, they will be hated.

GigabyteCoin 2 days ago 4 replies      
And this particular sysadmin-for-my-family will no longer allow windows to be installed on any of our computers whenever they are in need of a reinstall.

Windows is becoming more and more like Facebook. Too many users changed a setting you don't agree with? Just block access to that setting or call it something else to confuse enough people to the point that the numbers are "good enough" for management.

Almost every time I visit my parents, my mother's Windows 10 laptop has reverted at least one of the changes I have made.

I used to keep a copy of Windows available via dual-boot on all of my laptops, just in case I needed to print something in a remote location on hardware that whichever flavor of Linux I was using didn't support. Not anymore. Linux Mint serves that requirement just fine.

FuriouslyAdrift 2 days ago 5 replies      
We are tied to Microsoft due to a multi-million dollar ERP. I have frozen at Win 8.1 (software assurance contract). I love Server, and maybe once Server 2016 is out this Fall/Winter I can circle back around, but the two Win10 machines we have (one is mine) tripped every security protocol we have (we do some stuff for foreign and local defense contractors). This is the Enterprise edition. In the end, I block a few thousand domains and entire netblocks within and without our networks, which completely breaks Cortana, the Store, etc., along with just about every Microsoft website. It's a pain. I'll be moving back to DragonFly BSD ASAP on my desktop and running VMs whenever I need to hop into Windows.
hackuser 2 days ago 4 replies      
I want devices that I, the owner, control, whether I'm a small business or enterprise or individual. This is important for many reasons, from security to freedom-to-tinker to using the device I own in the way I want.

It was once an accepted standard in IT. Now, can anyone name a current handheld or desktop system that provides end-user control?

If you don't think security, including privacy, is important, consider what a U.S. president with fascistic tendencies would do with all this access to citizens' devices and data (and how many companies would risk their enterprises when he leaned on them?).

TheRealDunkirk 2 days ago 6 replies      
The ball is back in Apple's court. They seem to take the end user more seriously, but they've been mixed. I made a comment taking Microsoft to task when their surreptitious telemetry came to light, and someone pointed me to proof that Apple was doing about the same thing. This is Apple's chance to continue to distance themselves from their competition. They've done well standing up for privacy, with the recent FBI demand to decrypt iPhones, but this is a chance to go further.

Man, I really wish Google would release their desktop Linux. Ubuntu is OK, but someone with pockets like that could finish the job, and make a credible, consumer-accessible, 3rd alternative to keep BOTH #1 and #2 on their toes. If I could just run Linux-supported games with the same performance as under Windows -- I'm not even talking Windows games under Wine -- I might finally get rid of my Windows partition to get away from such things. Valve has got to be working on a Linux distro, which they will release on their SteamBox (along with Half-Life 3, mark my words), but who knows when THAT will be.

rchowe 2 days ago 3 replies      
The reasoning behind this made a lot more sense to me once I started to do a back-of-the-envelope calculation.

There can't have been that many end users who had Windows 10 Pro and went into group policy to turn off the store. So you're looking at small businesses who were using PCs with Win10 Pro on them (likely that came with the PC) that were turning off access to the store but can't any more. The IT admins for these companies are the people Microsoft wants to upsell.

Let's say that there are 500 businesses who care about this feature, each with an average of about 20 PCs (probably a high estimate for PCs, low estimate for number of businesses). That's 10,000 PCs that Microsoft could potentially convince to upgrade, at (a quick guess based on Google) $120/PC, to Win10 Enterprise, or a potential $1.2M more in revenue that doesn't cannibalize one of their other businesses (assuming more changes to differentiate Pro vs Enterprise). Probably the people who will upgrade are people with factory computers running Win10 Pro.

And for people that don't upgrade, they get to promote their app store. Win-win.
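rchowe's envelope math checks out; spelled out as a sketch (every input figure is the comment's own guess, not a known number):

```python
# Back-of-the-envelope from the comment above; all inputs are guesses.
businesses = 500
pcs_per_business = 20
upgrade_price = 120      # rough Pro -> Enterprise cost per PC, per a quick Google

pcs = businesses * pcs_per_business      # total PCs that might upgrade
revenue = pcs * upgrade_price            # potential extra revenue
print(pcs, revenue)  # 10000 PCs, $1,200,000
```
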

overgard 2 days ago 0 replies      
Who would have thought that an update policy that allows a corporation to silently update your computer whenever they feel like it would be abused?
chris_wot 2 days ago 0 replies      
So basically, what Microsoft is really doing is forcing admins to block access to the app store through their firewall or proxy, or to set up local workstations' firewalls via Group Policy to block the app store.

Or remove the app store entirely, which is technically possible as it's not an essential part of Windows. (If it is, then I invite them to review the times they were forced to state that Internet Explorer was an essential part of their operating system during the antitrust trial...)

I don't think they've thought this one through very well.

jalami 2 days ago 2 replies      
Microsoft isn't the only guy doing this. I understand enterprise/professional customers have gotten exceptions for years, but for everyone else this is common practice on almost all other platforms. I still think it's a bad practice.

Microsoft is just following the other companies that are winning and somehow doing so without pissing their userbases off. Their main asset, as I see it, are people that cannot or will not switch. So as it is for most companies with a semi-loyal userbase: lock it down before the garden empties too much.

iOS has an appstore, Android has an appstore, Mac has an appstore, Chrome has an appstore, Firefox has an appstore, Ubuntu kind of has an appstore. Firefox, iOS and Chrome don't allow you to install outside of their appstore without running different builds. Android makes it difficult; removing it is even more difficult, and you lose half your phone in the process. Sure, there's Homebrew, F-Droid, Cydia and Chocolatey for hackers, but that's a tiny subset. Windows really wants control like everyone else. The internet has changed a lot since the decentralized software and hardware days Microsoft is used to. Microsoft doesn't get to sell their user metrics, control what users install on their systems, or control where they install from. They don't get to charge uploaders or put fees on downloaders/purchasers. The Windows store is pretty much a flop at this point, but they want it to be the canonical way to install software on Windows, like on every other platform.

Not a Windows problem really, they just get the negative press that every system should get for trying to force people into a garden. If it gains steam years down the road, I could see them pull a Firefox and lock down external installs without 'approval' for security.

Just a few weeks ago I bought a Microsoft Miracast dongle, OS independent or so it claimed. The only way to configure it was to have a Windows 10 computer and download the driver/configuration software from their appstore. I no longer own it. I really don't think this is an isolated problem though.

Edit: Clarification

geographomics 2 days ago 0 replies      
You can uninstall it from an administrative PowerShell using this command:

 Get-AppxPackage Microsoft.WindowsStore | Remove-AppxPackage
Would this not continue to work?

thothamon 2 days ago 1 reply      
Just one more reason for me to avoid Windows whenever possible, for myself and for my companies.

I grant Microsoft today is a much better corporate citizen than it was 15 years ago, and I appreciate that. But a move like this feels very much like the bad old days to me.

colemannerd 2 days ago 4 replies      
Unlike many comments, I really like this. I think it unthinkable that businesses reduce employees' productivity by locking down their machines. In the days when Microsoft wasn't checking the security of applications, this was understandable. With a managed and secured store, this is security for the sake of security. If you believe you should disable something just for security without examining the value of the feature, why are you letting users access the internet?
tdkl 2 days ago 3 replies      
How can someone still trust MS ?
satysin 2 days ago 1 reply      
They should rename it to Windows 10 "Professional" Home Edition
bedane 2 days ago 0 replies      
This reminds me of the mandatory updates for the cheapest windows 10 version.

In the era of bloated/invasive OSes and arbitrary pricing according to the customer's profile rather than according to the value of the product, you don't pay more for more features.

You pay more for the right to deactivate the unwanted features.

Some day you will have to pay for disabling all the "telemetry" and "unique advertiser ID" stuff. Or maybe that's already the case, I didn't bother checking.

Esau 2 days ago 0 replies      
I have historically liked Windows, but I shudder at what is becoming of it. Every time I turn around lately, Microsoft is taking administrative freedom away from users and businesses.
jheriko 2 days ago 2 replies      
the criticism here is imo unfounded. MS are exceptionally supportive of developers in my experience, and this feels like part of that.

since blocking access to the store is in fact an enterprise requirement... why not restrict it to the enterprise edition?

... and besides that. how about employers trusting their employees anyway?

this is much less than what other platform holders have done in this respect too... Apple being foremost amongst the worst in this category - and yet still receiving fanboy support to the level of religiosity.

mtgx 2 days ago 0 replies      
Why even have a Pro version? Just to charge twice as much for BitLocker, which should be default in every Windows version, just like it is on any other operating system?
hardlianotion 2 days ago 0 replies      
That's - er - disingenuous.
venomsnake 2 days ago 1 reply      
Gee microsoft ... if only admins knew that there existed firewalls.
rasz_pl 2 days ago 1 reply      
No they don't; blocking the Windows Store is one firewall rule.
ambulancechaser 2 days ago 1 reply      
I think it's a general rule that anytime you derisively mock someone's name with these snide, "clever" puns--or whatever you want to call them--any argument you could make is ignored.
CyberDildonics 2 days ago 0 replies      
How many bait and switches will people put up with?
Every top 5 song from 1958 to 2016 polygraph.cool
324 points by pzaich  2 days ago   104 comments top 29
robbiemitchell 1 day ago 1 reply      
So. Cool.

1. Playing around in nostalgia territory and am amused by how well pop music re-uses itself over and over.

For instance, in May-Oct 1993, #1 spots were dominated mostly by three songs:

8 weeks: "That's the Way Love Goes" (Janet Jackson) --> samples "Papa Don't Take No Mess" by James Brown (1974)

7 weeks: "(I Can't Help) Falling in Love with You" (UB40) --> cover of "Can't Help Falling in Love" by Elvis (1961)

8 weeks: "Dreamlover" (Mariah Carey) --> samples "Blind Alley" by The Emotions (1972)

2. Compare that kind of pop-culture inertia to, say, 1974, where the top spot cycled much more frequently, yet those songs have been sampled heavily in the years since.

tinkerdol 2 days ago 0 replies      
For those interested in discovering music over the decades from different countries: http://radiooooo.com/
JusticeJuice 2 days ago 1 reply      
Shameless plug - if you want more nostalgic music, check out my project http://thenostalgiamachine.com/
madengr 2 days ago 7 replies      
Is it me, or did the 80's just have better music? Maybe it's just when you become cognizant of music. My first album was Men at Work. Wore that tape out, and the music is STILL good.
transfire 1 day ago 0 replies      
I listened all the way up to the 90s. I have to wonder how this chart is determined. While obviously many of the popular songs of the period trend, there are some places where it doesn't seem right. For example, Michael Jackson's Thriller never broke #3, and MJ barely made the weekly top-ten chart until BAD came out. That can't be right. I checked Wikipedia and it says all MJ songs from that album made #1.
delgaudm 2 days ago 2 replies      
What I would love to see is how many plays/streams each number one got "this" week by comparison. It would be interesting to see the longevity of some of the songs as a replacement for "good", i.e. are there hit songs from the 60's or 70's that still get way more plays than hit songs from the 90's or 00's.
pathsjs 2 days ago 4 replies      
Is there a way to hear it in the background? Every time I switch tabs, it stops playing. Also, it sometimes mixes up songs from different periods (1958+1997)
vlehto 2 days ago 7 replies      
Interesting how some themes changed completely. You would never expect to hear "The Ballad of the Green Berets" as the top song anymore.

I think there is a noticeable deterioration in the music quality of the top hits since the turn of the millennium. I guess it's a combination of several factors: music getting cheaper, youngsters getting more money, and people in their thirties having more and more options for what to listen to, so they are less likely to head to the record store for any one given single. I'd say music was still good in '86. (I was born then.)

From 2001 onwards it seems the teens started to swarm the Billboard Hot 100 or something. Boybands are out and music is suddenly OK again.

S_A_P 1 day ago 0 replies      
I got to the early 70s before I ran out of time. The 60s were an incredible time for music. The sheer variety is amazing. Pop, soul, bossa nova, rock. I think the 90s had a smaller creative burst with rock and hip hop and to some degree dance/electronic genres. I'm ready for another music revolution.
mwexler 2 days ago 3 replies      
I enjoyed this, but was surprised to see how the older stuff (1970s) skyrocketed to the top, then PLUMMETED off after a week. Compare to the 2000s, where a song can stay at the top for a month and may drift down.

As others point out, changes in how popularity is tracked (radio play vs. download sales vs. streams) can affect this, but it is fascinating to see visually.

philwelch 22 hours ago 0 replies      
One fascinating point is the sudden collapse of disco in the summer of 1979. This may be related: https://en.wikipedia.org/wiki/Disco_Demolition_Night
briHass 1 day ago 1 reply      
It's easy to forget just how well some modern artists fine-tuned the hit song production process. Everyone thinks of The Beatles, but Mariah Carey, Rihanna, and Beyonce (if we include her Destiny's Child time) all beat them for weeks at #1.

Going by weeks at #1, Mariah Carey is probably the most popular artist of all time (i.e., since Billboard started recording). She also had a collab ("One Sweet Day") that still holds the record for most weeks at #1. Not only her raw numbers, but her career delivered #1s spanning 3 decades.

I guess I never thought that she was as big as she is/was.

oneloop 2 days ago 1 reply      
Is it just my imagination, or are black people overrepresented in the top 5?
lomnakkus 1 day ago 0 replies      
This is absolutely wonderful. Such a great idea and execution!

I have to wonder how many of these Nile Rodgers[1] is responsible for.

[1] https://en.wikipedia.org/wiki/Nile_Rodgers

davidw 1 day ago 0 replies      
Hah, cool - Nel Blu Dipinto di Blu (aka Volare) was the last Italian song to get anywhere on the US charts.

With good reason IMO - I'm not much of a fan of Italian pop. I go for a bit more esoteric stuff like Fishbone, Oregon's own Cherry Poppin' Daddies, and Los Fabulosos Cadillacs.

konart 1 day ago 0 replies      
Very cool project.

Had to stop the music at first. 1990s western music (pop music, that is; can't see anything else there) was... well - bad. Didn't even realize it until now.

dbalan 2 days ago 2 replies      
It's quite easy to track song popularity down to the number of plays, thanks to streaming services, but that's now. Any idea where the old data comes from? Couldn't find any links on the site.
vladaionescu 1 day ago 0 replies      
I wish someone made a Spotify playlist for this.
santa_boy 2 days ago 1 reply      
FWIW, I was wondering about the source. This is the top 5 per Billboard (in the title). Is Billboard the global leader in tracking hits? (Just inquisitive.)
gerjomarty 2 days ago 0 replies      
I love this a lot, but every time I open it or switch back to it, it just makes me want to go and listen to Skee-Lo's I Wish.
danielweber 2 days ago 0 replies      
Is there a way to change the resolution from every day to every week or every month?
golfstrom 2 days ago 0 replies      
If you enjoyed this, I highly recommend the podcast episodes from The Gist (from Slate) where they talk about Billboard #1 hits from various years.


barbs 2 days ago 0 replies      
I like the concept though I kind of wish I could zoom out and see more than the top 5
mosselman 2 days ago 0 replies      
This is beautifully made. It works very well.
kevindeasis 1 day ago 0 replies      
Good job, this is really cool
taigeair 2 days ago 0 replies      
Very nice shameless plugs. Wouldn't mind seeing more.
ck2 1 day ago 0 replies      
As someone who never listens to the radio, so I don't know what's currently being paid to play, I found this very educational and entertaining.
akhilcacharya 1 day ago 0 replies      
visarga 2 days ago 2 replies      
I don't like most of the top #1 hits. Averaging the masses of people will give average music.
.NET framework ported to NetBSD github.com
263 points by nbyouri  2 days ago   37 comments top 9
nbyouri 2 days ago 4 replies      
NetBSD is a very portable BSD-licensed operating system. This is CoreCLR, the open source .NET framework from Microsoft.

Why is this important? Programs for NetBSD can be run on a stripped-down NetBSD kernel and libc (a unikernel) called a rump kernel, which can be run on any platform, including on bare metal. It is also extremely minimal - an operating system image for NetHack was a mere 4MB.

Rump can also run on any architecture that NetBSD runs on (including MIPS, VAX, m68k, as well as standard platforms like amd64 and ARM), and also on those without full NetBSD support that just have a functional C compiler (like RISC-V). Many people use rump as an extremely lightweight container - it can run on Linux and others.

Rump was recently used to port NetBSD audio drivers to GNU/Hurd.

Caveats: it doesn't run in rump yet, but this is a big step towards it. .NET is also not self-hosting yet, and requires assemblies (CoreFX) to be cross-compiled from Linux.

The work was done almost entirely by Kamil Rytarowski, a NetBSD developer, in his free time. Good job Kamil!

josteink 2 days ago 2 replies      
From a former contributor's point of view, it's fun checking the commits it took to make this happen. From what I can tell, some of the work I put in to get CoreCLR working on FreeBSD (which in turn built on others' work to make it run on Linux and OS X) now helps it run on NetBSD too.

This is why I like open-source. If my toaster one day runs .NET, I'll be able to say I helped make it happen ;)

mdip 2 days ago 1 reply      
As a .NET developer, and lover of all things C#, I couldn't be more happy. There's always been that nagging feeling in the back of my mind that I'm pigeonholed with a language I can't easily use on other platforms (Mono's been around for a while and I've used it but I ran into a variety of problems the last time I targeted it[1]).

"Officially" supporting alternative platforms gives me the opportunity to write code in a language I prefer for the handful of Linux boxen I've had here. I'm looking forward to poking around with this in NetBSD. I've had a box doing virtually nothing here for a while running an old version -- guess it's time to upgrade!

The thing that still makes me hang on to other languages is uptake. I haven't looked in a little while, but back when Mono was "it", the number of applications available outside of Windows was pretty small. Basing an application on Mono tended to come with grief lobbed at the developer[2]. Has any of that changed or shown signs of changing since CoreCLR was released?

[1] I'll be the first to admit that these problems had workarounds I wasn't familiar with at the time and though I had no similar problems with a handful of CoreCLR apps I've written, I was far more researched at my second attempt and would have probably been successful with either had I bothered to try.

[2] I tried to find examples - my memory is not serving me well but I thought a component of OpenSUSE or something along those lines was written targeting Mono and was later scrapped for another technology due to weak community support. The best example of complaints I could find is http://www.linuxuser.co.uk/features/mono-a-gratuitous-risk-t...

kingmanaz 2 days ago 0 replies      
NetBSD scales well. See SDF.org for working example. Hundreds of users from around the world online simultaneously at any given moment, thousands of Unix commands and services being used and consumed, and despite all that, a great security record and good performance.

NetBSD really is a nice system and deserves to be as popular as the other BSDs (Free-, Open-, etc.).

bloaf 2 days ago 1 reply      
I've been reading around to try to get a clearer picture of what is going on under the hood in Microsoft land. Essentially, I'm curious what it would take to build an OS like Inferno, but that provides the CLR as a VM for CIL instead of Limbo.

1. Am I correct in thinking that this is sort of what NetBSD has done?

2. Does this mean that through Roslyn + RyuJIT, NetBSD gets to run programs written in C#, F#, and other visual-* languages?

3. How close to support for things like WPF/UWP does this get NetBSD?

CiPHPerCoder 2 days ago 1 reply      
Can I be the first to call it .NETBSD?
Hydraulix989 2 days ago 0 replies      
Wrong way around. Generally we port NetBSD to something else. This headline should be "NetBSD ported to .NET framework"
toomanythings2 2 days ago 0 replies      
And Microsoft loves Linux, too.
iamabhi9 2 days ago 0 replies      
Great work :)
A beginner's guide to Big O notation (2009) rob-bell.net
307 points by g4k  3 days ago   65 comments top 13
krat0sprakhar 2 days ago 1 reply      
On the topic of efficient algorithms, I recently read a nice note in an Algorithms textbook -

> It would appear that Moore's law provides a disincentive for developing polynomial algorithms. After all, if an algorithm is exponential, why not wait it out until Moore's law makes it feasible? But in reality the exact opposite happens: Moore's law is a huge incentive for developing efficient algorithms, because such algorithms are needed in order to take advantage of the exponential increase in computer speed.

Here is why. If, for example, an O(2^n) algorithm for Boolean satisfiability (SAT) were given an hour to run, it would have solved instances with 25 variables back in 1975, 31 variables on the faster computers available in 1985, 38 variables in 1995, and about 45 variables with today's machines. Quite a bit of progress, except that each extra variable requires a year and a half's wait, while the appetite of applications (many of which are, ironically, related to computer design) grows much faster. In contrast, the size of the instances solved by an O(n) or O(n log n) algorithm would be multiplied by a factor of about 100 each decade. In the case of an O(n^2) algorithm, the instance size solvable in a fixed time would be multiplied by about 10 each decade. Even an O(n^6) algorithm, polynomial yet unappetizing, would more than double the size of the instances solved each decade. When it comes to the growth of the size of problems we can attack with an algorithm, we have a reversal: exponential algorithms make polynomially slow progress, while polynomial algorithms advance exponentially fast! For Moore's law to be reflected in the world we need efficient algorithms.
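The note's numbers are easy to verify; a quick sketch (assuming, as the note implies, that the hourly compute budget doubles every 1.5 years):

```python
import math

# Assume the compute budget available in a fixed hour doubles every
# 1.5 years (the Moore's-law cadence the note uses), so per decade:
speedup_per_decade = 2 ** (10 / 1.5)                # ~101.6x, "about 100"

# O(2^n): the extra budget buys only log2(speedup) more variables per
# decade, matching the 25 -> 31 -> 38 -> 45 progression.
extra_vars = math.log2(speedup_per_decade)          # ~6.7

# O(n^2): solvable instance size multiplies by speedup^(1/2) per decade.
quadratic_factor = speedup_per_decade ** (1 / 2)    # ~10.1, "about 10"

# O(n^6): solvable size still more than doubles each decade.
sextic_factor = speedup_per_decade ** (1 / 6)       # ~2.2

print(round(speedup_per_decade, 1), round(extra_vars, 1),
      round(quadratic_factor, 1), round(sextic_factor, 2))
```
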

mattlutze 3 days ago 2 replies      
Looks like someone finally got it to the front page:


Lxr 2 days ago 2 replies      
> O(N) describes an algorithm whose performance will grow linearly and in direct proportion to the size of the input data set

Nitpick, but this is technically wrong. There needs to be a "no faster than" somewhere in there. Merge sort is, for example, included in O(2^n) by the actual definition of big O (vs big theta, which is defined as a tight bound). So there exist algorithms that are O(N) that don't grow linearly.

f(x) is O(g(x)) just means there is some constant K such that f(x) <= Kg(x) for all x beyond some value. This has nothing to do with 'worst case performance'.
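That definition can be checked empirically. The sketch below (illustrative only, not from the original comment) counts merge sort's comparisons f(n) and verifies they sit under K·g(n) with g(n) = n log2 n and K = 1:

```python
import math

def merge_sort(a):
    """Merge sort that also returns the number of comparisons made."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comps = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged, comps

# f(n) <= K * g(n) with g(n) = n log2 n and K = 1, even on reversed input
for n in [16, 256, 4096]:
    _, f = merge_sort(list(range(n, 0, -1)))
    assert f <= n * math.log2(n)
```

Since n log2 n is itself bounded by 2^n for large n, the same counts also satisfy the (useless but true) statement that merge sort is O(2^n), which is the point being made.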

njharman 2 days ago 1 reply      
I like this page http://bigocheatsheet.com/

Especially the graph, which hammers home how quickly things go wrong.

nayuki 2 days ago 2 replies      
The article is mostly good, but I have one nitpick.

> An example of an O(2^N) function is the recursive calculation of Fibonacci numbers

No, the naive recursive evaluation of the Fibonacci sequence has complexity O(1.618^N) (or just O(Fib(N))). It is unequal to O(2^N) because the base of the power is different.

btilly 2 days ago 0 replies      
Common mistake. When people say O(2^n) they USUALLY mean something closer to 2^O(n).

The reason is that f(n) = O(g(n)) means that for some constant k and integer N, if N < n then |f(n)| < kg(n). In other words "grows no faster than a constant times". However when you've got exponential growth the slightest of things can cause the exponent to be slightly different, or there to be a small polynomial factor.

That was the case in his example. (The exponent was different.)
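The base can be observed directly by counting calls. In this illustrative sketch, the ratio of call counts for successive n converges to the golden ratio φ ≈ 1.618, not 2:

```python
def fib_calls(n):
    """Naive recursive Fibonacci; returns (value, total recursive calls made)."""
    if n < 2:
        return n, 1
    a, ca = fib_calls(n - 1)
    b, cb = fib_calls(n - 2)
    return a + b, ca + cb + 1

phi = (1 + 5 ** 0.5) / 2      # ~1.618
_, c20 = fib_calls(20)
_, c21 = fib_calls(21)
print(c21 / c20)              # ~1.618: call counts grow like phi^n, not 2^n
```

(The total call count is 2·Fib(n+1) − 1, so it grows at exactly the same rate as the Fibonacci numbers themselves.)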

minikites 3 days ago 4 replies      
The best layman's example of O(log N) I've heard is finding a name in the phone book. You open it around the middle, select which half gets you closer, and repeat. It's then easy to "get" why it's not a big deal if it's a huge phone book versus a tiny phone book.
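The phone-book intuition is easy to make concrete. In this sketch, a "phone book" a thousand times bigger costs only about ten more flips:

```python
def lookup(names, target):
    """Binary search, like halving a phone book; returns (index, steps taken)."""
    lo, hi, steps = 0, len(names) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if names[mid] == target:
            return mid, steps
        if names[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

_, s1 = lookup(list(range(1_000)), 999)          # ~log2(1000)  ≈ 10 steps
_, s2 = lookup(list(range(1_000_000)), 999_999)  # ~log2(1e6)   ≈ 20 steps
print(s1, s2)
```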
nathan_long 2 days ago 0 replies      
Nice! Very succinct and clear. I wrote an intro myself, as someone who'd recently learned the concept. Wordier, but might be helpful if you think like I do. http://nathanmlong.com/2015/03/understanding-big-o-notation/
curiousDog 2 days ago 1 reply      
I think the nicest way to learn this if you don't have a formal CS education is to still pick up a Discrete Math text book (like Chapter 3 in Rosen) and then read chapters 3 and 4 in CLRS.
gameguy43 2 days ago 0 replies      
Super interesting comparing this to the one we have on Interview Cake:https://www.interviewcake.com/article/big-o-notation-time-an...

I like how Rob Bell's piece has headings with the most common time costs--sorta makes it easier to see at a glance what you're going to get, and I imagine gives folks a quick sense of, "Right, I've seen that! Been /wondering/ what that means."

vvanders 3 days ago 6 replies      
> O(N) describes an algorithm whose performance will grow linearly and in direct proportion to the size of the input data set.

Argh, I hate this every time I see Big O notation covered.

Big O != performance. If you have an O(N) algorithm that walks the data set in the order that it's laid out in memory(assuming contiguous data) it will beat your O(NlogN) and sometimes even O(logN).

[edit]meant to omit nlogn, that's what I get for any early morning rant pre-coffee.

Radix Sort is the classic example I always bring up. On machine word size keys, with a separate pointer look-up table(to get final value) it will beat QSort, MergeSort and all the other NlogN sorts by 10-50x. This includes having to walk the data 3-4 times depending on how you want to split the radix to line up with cache sizes.

Friends don't let Friends use Big O to describe absolute performance.
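For reference, the shape of the algorithm being described: an LSD radix sort that walks the data once per digit, so with an 8-bit radix it makes exactly 4 passes over 32-bit keys. This Python sketch shows the structure only; the cache-friendliness wins described above come from a tuned native implementation, not from Python:

```python
import random

def radix_sort(keys, radix_bits=8, key_bits=32):
    """LSD radix sort on fixed-width non-negative integer keys.

    Runs in O(n * key_bits/radix_bits): here, 4 linear passes over the data.
    """
    mask = (1 << radix_bits) - 1
    for shift in range(0, key_bits, radix_bits):
        buckets = [[] for _ in range(1 << radix_bits)]
        for k in keys:                          # one stable pass per digit
            buckets[(k >> shift) & mask].append(k)
        keys = [k for b in buckets for k in b]
    return keys

data = [random.getrandbits(32) for _ in range(10_000)]
assert radix_sort(data) == sorted(data)
```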

emodendroket 2 days ago 0 replies      
It would be sensible to put the terms like "constant" and "logarithmic" in there, IMO.
kevindeasis 2 days ago 0 replies      
Is there a beginner's guide to proving different time complexities?
Tesla crash after flying 82 feet in the air shows importance of a crumple zone electrek.co
247 points by vinnyglennon  19 hours ago   135 comments top 24
reitanqild 8 hours ago 2 replies      
Tangentially related but the comments are the best part IMO:

> Anton Korn 2 days ago

> I believe that there should be some sort of a software based limit on the maximum acceleration for new or unexperienced drivers.

> No 18 year old should be given the option to drive a supercar. it is just too dangerous.

>> Kerry Manning > Anton Korn 2 days ago

>> I believe that the software has been around for quite a while. It is an advanced neural network commonly referred to as "a parent". :)

>>> Going Knightly > Kerry Manning a day ago

>>> Said software, unfortunately, only controls the vehicle as long as the drivers are within observational range. Once the vehicle leaves said range, the software is defunct and is running on the backup operating system "Wishful Thinking 2.0".

>>>> Kerry Manning > Going Knightly a day ago

>>>> Actually parenting is more about teaching your kids to do the responsible thing when nobody is watching. Without that it's not "parenting" it's "babysitting"

sandworm101 7 hours ago 5 replies      
(1) "It takes a lot of speed to flip a 5,000 lbs Model S with a low center of gravity."

No it does not. Once a car is off road, on non-level ground or as in this case flying, it can roll at any speed. Push a Ferrari off a cliff and it might roll, flip and do somersaults no matter its centre of gravity.

(2) That car shows a very bad sign: impacts on both front and rear. That means multiple impacts separated by some period of time. The problem with airbags is that they can only deploy once. Same too for crumple zones. What saved these kids was most likely the seatbelts, the only safety feature that remains functional after the initial impact. This is why I am against the new trend of shock-absorbing seatbelts with stitched expansion zones, what rock climbers might call screamers. They don't work twice.

Forget the fancy safety features. The humble belt is more important than all of them put together. If you are going to roll a car, a good seat and a 5-point restraint is better than a hundred airbags.

Colin_M 11 hours ago 2 replies      
"SpaceX successfully recovers teens after launch"
ascorbic 8 hours ago 2 replies      
Not quite on topic, but this headline is a good example of false precision in converted units. Did they precisely measure it to the foot? Of course they didn't. The article makes it clear it's 25 metres, which is probably an estimate within 5-10 metres. 80 feet would have been a more helpful conversion.
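The false precision is just the conversion factor carried to too many digits; a round 25 m estimate lands on the oddly specific headline figure:

```python
METERS_TO_FEET = 3.28084
reported_m = 25                              # the article's figure, itself a rough estimate
print(round(reported_m * METERS_TO_FEET))    # -> 82, the headline's "82 feet"
```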
krschultz 17 hours ago 2 replies      
Not to be taken as a knock on Tesla, but from snowboarding I've learned the pain is not in the distance you fly, but the sudden stop at the end. Tumbling across an empty field is about the best case scenario. An unfortunately placed tree would be fatal at half that speed.
avar 18 hours ago 2 replies      
I suppose it doesn't matter much in practice because few things have the density of an engine block, but I wonder if the Tesla is crash tested with luggage in the frunk, or always with it empty.

People are going to be putting stuff into the frunk, even stuff that may be really solid and pointy and likely to breach into the cabin in the event of a crash.

Is it especially armored to deal with those situations? Or should Tesla owners generally keep it empty if they're concerned with safety?

flyinghamster 1 hour ago 0 replies      
My parents survived a head-on collision in a 1965 Corvair vs. an out-of-control Camaro, with minor injuries, thanks to the trunk being in front. The trunk was just about obliterated, but the passenger compartment was not compromised. Had they been in a Chevy Vega, Ford Pinto, or other '60s/'70s-era small car, they would have likely been maimed or killed.

As far as I'm concerned, the second-generation Corvair was considerably safer than typical small cars of the era; the trunk was large, the gas tank was behind the front axle, and the first-generation Corvair's swing axle rear suspension was superseded by a fully-independent design.

I'd feel safer even in a first-generation Corvair than in a classic VW Beetle. F*ck Nader.

lostmsu 6 hours ago 1 reply      
Is it here because of that Tesla hype? Tens of cars per year probably crumple just like this Model S, but there are (rightfully) no articles on HN or other IT web sites about that.
userbinator 18 hours ago 2 replies      
I find it more amazing the side windows are still intact, and the roof is also amazingly clean for a vehicle that is said to have rolled over "at least once" in a field.
samch 13 hours ago 2 replies      
Having been in a couple of rough accidents in my life, I thought it was pretty stunning that the frame of the passenger compartment was durable enough that the occupants were simply able to open the doors to get out. From what I've been witness to, often the frame and doors bend enough in a big wreck to make it difficult to exit the vehicle. The photos showing the front two doors open with all of the glass intact are quite impressive.
frik 8 hours ago 1 reply      
More interesting would be a crash against a tree or a concrete pillar - then one could argue whether the crumple zone of a Tesla S is better compared to a traditional superior-class Volvo or Mercedes with a front motor. Ignoring the videos of certified crash tests, the Tesla S doesn't look that special: https://www.youtube.com/results?q=tesla%20crash and even in certified tests it isn't in the top range: http://www.euroncap.com
aidos 17 hours ago 1 reply      
It's with some irony that the ads I see on that page are for Vauxhall.

I can't zoom in because I'm on a mobile but the final picture looks like it has the vehicle off in the distance. If so, I can't believe how far off the road it is - it must have really been moving at the time. Does it have telemetry data they can use post accident to figure out speeds and things?

roflchoppa 8 hours ago 2 replies      
Yeah every time I get into my car from 1972 I think, "If I do hit something, this steering rack is going into my face"
lancefisher 16 hours ago 0 replies      
It looks like it flipped end over end with the front taking most of the impact, then the rear being crushed on the flip. The cabin must be extremely solid since the windows are still intact.
benaston 4 hours ago 0 replies      
This is not journalism. It is a (likely paid for) ad for Tesla.
Overtonwindow 18 hours ago 2 replies      
My 1998 Escort Zx2 had a gap between the bumper and the engine of about a foot that came in handy during a rear-ending. I think crumple zones of empty plastic are a great idea, and wish they'd install more of those on cars. It dissipates energy and potentially saves the car from what would otherwise be a car-destroying accident.
james_pm 18 hours ago 4 replies      
Crumple zones are all good, but why would you put that kind of power in the hands of an inexperienced and risk-happy 18-year-old driver? Looks like this was a P model (red calipers) which also means it has Ludicrous mode.

Perhaps Tesla could offer some innovation in the form of a de-tuned mode that triggers in the absence of a certain set of keys or pin entry.

chx 17 hours ago 1 reply      
Crumple zones even in much crappier cars are a blessing. My brother was once not given the right of way while he was driving a VW Polo (OK, Skoda Fabia, same car). The front of the car after looked like that Tesla but he could kick the door open (it was a little stuck) and get out. It's literally life and death. If Tesla is even better, I hope the next time a family member gets in an accident it'll be in a Tesla. That sentence probably sells cars.
ck2 4 hours ago 1 reply      
Yes, give an 18 year old a car that can accelerate that fast.

Tesla should make a "teenage driving mode" the parent can set.

So if there had been a huge heavy engine in the front, they probably would have been eating it with a front impact that hard?

rtnyftxx 16 hours ago 2 replies      
the car has so much software on board but isn't it personalized, so that when my daughter is driving the car she can't go faster than e.g. 30mph? Or would this be too obvious?
dghughes 17 hours ago 0 replies      
The frunk fell off.


coin 16 hours ago 0 replies      
-1 for disabling zoom on mobile devices
tyre 14 hours ago 1 reply      
More importantly, this crash shows the importance of getting humans, especially young "adults", out from behind the wheel of a 5,000 lb vehicle.
Friedduck 17 hours ago 0 replies      
Skip the article. Domain redirected me to a scam site (recognized I'm on Verizon, so it may not redirect everyone.)
Facebook sued for storing biometric data mined from photographs cnet.com
204 points by huntermeyer  23 hours ago   125 comments top 17
mosquito242 19 hours ago 10 replies      
I had the strangest interaction with the Messenger app a few months ago.

I was spending time with friends, and I took a few pictures of all of us (didn't send them through either FB or Messenger). A few hours later, messenger popped up a notification telling me something along the lines of "Hey, I see you took pictures today of <friend>. Want to send them to her?"

Made me feel incredibly creeped out that FB would take my photos and (presumably upload and) analyze them even when I hadn't given them to FB.

TazeTSchnitzel 22 hours ago 3 replies      
Facial recognition is really scary. This was recently demonstrated for Russia's Facebook, VKontakte, when a service appeared that let you look up people on that site by photo. So people started looking up the profiles of random subway passengers, outing sex workers to their families and friends, etc.
Esau 21 hours ago 1 reply      
"Facebook hoped to get the case thrown out on the grounds that its user agreement states any disputes should be governed solely by California law".

What about those people whose biometric data is being stored but who don't have accounts on Facebook?

gcr 21 hours ago 1 reply      
To be clear: the article claims that Facebook can recognize anyone just by seeing them. They haven't demonstrated this ability.

It's far easier to judge who's in your picture among your 20 closest friends than it is to find that face among all two billion Facebook users.

State of the art recognition systems still have a few orders of magnitude of accuracy improvements to go before they can solve that problem.

See the Iarpa Janus project for some recent government+academic-sponsored work on this front: http://www.nist.gov/itl/iad/ig/facechallenges.cfm The task is to recognize terrorists in airport surveillance pictures and such.

superobserver 17 hours ago 1 reply      
Not only will it store biometric data; if you happen to use their Messenger to share photos, those photos will remain stored on their servers even after you've deleted the associated conversation. Facebook's disregard for privacy and content control for its users is the biggest problem with its platform by far. And with its latest profit reports, it's bound to get worse.
gcr 21 hours ago 3 replies      
If you don't want your face to be recognized, you can sometimes prevent it from being detected in the first place.

Our research group recently looked into how to hide from Facebook's face detector. If you're uploading photos, adding white bars around the eyes of the subjects in the photo is the best way to prevent Facebook from finding the face. However, if you're out and about in the real world, even scarves and masks aren't enough--facebook sometimes finds these occluded faces too. There aren't very many easy answers. https://arxiv.org/abs/1602.04504

AndrewKemendo 19 hours ago 1 reply      
I think technologists need to help the public understand how if they are going to continue to fully integrate technology in assisting with their lives, they will be moving closer to a "privacy free" future.

The trade-off between having tailored services and "smart" systems is privacy from machine systems.

The technology community pines for futuristic technology like amazing machine personal assistants, forgetting that human personal assistants (see: professional executive secretaries) know basically everything about the person they are assisting.

KhalilK 21 hours ago 4 replies      
I attended a talk by Stallman last year and he implored anyone who took a picture of him not to post it on Facebook. Now I can see why.
FreedomToCreate 19 hours ago 1 reply      
Facebook was great when it was a way to connect with close friends and people in your immediate community (ex. university) and see what was happening around you. Now that it's trying to evolve into this platform that targets you so that it can make an insane amount of ad revenue, it has become less and less worth it. Especially in respect to our privacy.
visarga 19 hours ago 1 reply      
Maybe photos should carry a "robots.txt"-like permissions tag explicitly banning social networks from using it, and a cryptographic watermark embedded in the picture that is hard to remove, so they are not tempted to delete the tag. Cameras (and apps) should auto-tag files if the user so wishes.
crisisactor 20 hours ago 1 reply      
To be honest, most of these are surface level traits of an individual. There are deeper traits which go much more personal, and have even been touched on in popular culture in recent times, like in the new James Bond movie (think gait recognition). But even gait, although highly individual, is still touching the surface. I was thinking of 'trimming the bloom filter' to such a degree that we can recognize not only a person on sight, but by cue words, and individual dictionaries alone.

It is no secret that our world is divided by language alone, so then as analysts we can attach certain words to certain behaviors, and this has been proven countless times to expose a person. If I speak English I probably respond in the same way to 'pizza'. Pizza means food, and therefore pizza emotes a pleasure response. But 'bomb' and other words must decide a different response then?

Marketers have copped this early on and frequently use talismanic phrases to elicit positive responses to products, so why not Facebook, and any other tech harvester of data such as Google, et al? Last time I checked it is not a crime to elicit responses using personal, and individualized key-phrases.

breatheoften 11 hours ago 0 replies      
Photos permission access is not granular enough on iOS -- they should add a third access state for photos that grants an app permission to "write new photos to photo album, but prevents an app from scraping the photos library" ... Or maybe that should be the behavior of the current permission grant ...
fiatjaf 19 hours ago 2 replies      
Stop fighting it, just stop using it.
huevosabio 16 hours ago 0 replies      
Assuming that the concentration and mining of huge personal datasets is inevitable, can we design systems (other than law) that prevent these data to be misused?
Mendenhall 19 hours ago 0 replies      
One of the exact reasons I never used the data mining center known as facebook, creepy. The things You can do with such data is legion.
powera 19 hours ago 0 replies      
Humans have evolved for millions of years on the expectation that other people will recognize our faces. I'm sure we'll come up with a better solution than "file lawsuits to hopefully make computers recognizing our faces illegal".
Jill_the_Pill 20 hours ago 1 reply      
Can you obfuscate your "faceprint" by tagging a few other people as you?
FBI Harassing Core Tor Developer, Demand Meeting, but Refusing to Explain Why techdirt.com
239 points by xkiwi  2 days ago   119 comments top 19
mcherm 2 days ago 4 replies      
It sounds like her concern is that they might show her a magic piece of paper which, once you've seen it, requires you to do certain things and prohibits you from ever talking about it to anyone.[1]

Magic pieces of paper like that really shouldn't be a part of our legal system.

[1] https://epic.org/privacy/nsl/

cptskippy 2 days ago 1 reply      
She lost my support at vegan gluten-free brownies. She's clearly a monster.

But seriously, this really stinks. If talking to the FBI wasn't going to have negative consequences for her then why would they choose to approach her like that?

The whole attitude by law enforcement that "anyone who doesn't 100% cooperate with us on our terms is an enemy and should be treated as such" really doesn't foster cooperation but it does foster fear and resentment. I think the whole Apple/iPhone debacle demonstrates that perfectly. It's gotten to the point that businesses are finding that they're in a better position if they lock themselves out of their own data and tell law enforcement to fk off because their hands are tied. It's ironic because this non-cooperative behavior is a direct result of the abusive and hostile tactics law enforcement use against everyone.

warmblood 2 days ago 2 replies      
This is terrorism. Given what we all know based on past events about how the FBI conducts their activities, there is no way any reasonably aware citizen can conduct their life normally after such an encounter.

The last discussion thread on this topic had more than a few people complaining that Isis (the given name of the developer in the article) is overreacting and paranoid, which is a saddening response to see. It exposes the privileges and unfortunate circumstances citizens find themselves in because these agencies refuse to prosecute their anti-terror investigations in well-thought out ways, instead pursuing facile leads without regard to the external effects they cause.

We've even seen evidence that these agencies deliberately manipulate otherwise innocent people into behavior that implicates them in their "terror suspect" criteria, so it's hard to believe that anyone in this situation could be somehow too cautious.

grecy 2 days ago 1 reply      
>But if we happen to run into her on the street, were gonna be asking her some questions without you present.

Impressive for a LEO to state out loud he doesn't care about rights and due process. Like maybe he thinks he's above the law.

0091810911 2 days ago 1 reply      
"I got absolutely no work done."

That's the tactics they use (don't ask how I know that). Be strong and welcome to Germany!

appleflaxen 2 days ago 0 replies      
I wish the FBI agents who perform these kinds of actions were as introspective as Snowden. Once you define your team as "the good guys", you can do all kinds of morally objectionable things to the other side.

"the ends justify the means" at that point

celticninja 2 days ago 0 replies      
The worst thing about this is that it will turn out that the developer's involvement was relatively innocuous in the grand scheme of things, hence the lack of any real urgency from the FBI, but it is terrifying, both for the developer and as a citizen, that the FBI seem to be inept and/or deliberately making it so difficult to speak with someone who it would appear they think can assist them.
zekevermillion 2 days ago 0 replies      
The problem is the attitude that the NSL is just another "tool" in the FBI's toolbox, to be used as aggressively as can be in the pursuit of criminals. There is no effective check or balance against this -- we are forced to rely on the investigators to use their discretion, knowing that any challenge to such a warrantless request has about a zero % chance of being successful before the FISA court.

I would like to see an activist attempt to challenge such an order, or even refuse to cooperate in an NSL where the FBI is acting outside of its legislative and Constitutional authority. However, the overwhelming incentive in any such situation is to cooperate. And from what I have read, even if the FBI is acting illegally, the subject of the investigation aided by the NSL may still be a fairly loathsome criminal -- so, you have to be quite a principled activist to risk prison time to make a civil rights statement, when this is directly going to benefit a badguy in the specific instance.

tzaman 2 days ago 1 reply      
Why does everyone, including Ms. Lovecruft conclude that something is horribly wrong and that FBI wanting to speak with them means they are in deep shit? I'm by no means defending them, but to me, the whole (original) post looks overly paranoid.

I'm not an American, you can speak with Police freely here in Slovenia, so maybe I understand things wrong?

deadtofu 2 days ago 1 reply      
Could they be trying to serve her?
golergka 2 days ago 4 replies      
The original story told about a couple of polite phonecalls where they have requested to talk with her. How does it get described as "harassment"? Of course, she also added a lot of paranoid speculation to her original blog post, but there's not a single confirmation of her fears so far, not even a hint.
hackuser 2 days ago 0 replies      
1) Imagine the position of people who can't afford a lawyer, don't work for the EFF and know their rights, can't easily move to anothher country, and don't have a platform from which to publicly tell their story.

2) Imagine this power in the hands of a President with fascistic tendencies.

jeffdavis 2 days ago 0 replies      
This didn't really register with me as tyranny. What did the FBI actually do to her?

Going around a lawyer is a good way to lose a case against her, so maybe they weren't trying to make a case against her.

Maybe they actually had some real questions that involved her expertise or knowledge she might have had.

tobltobs 2 days ago 0 replies      
This is like how the SA treated people at the beginning. They knew they did not have any legal foundation for their harassment, but also that they would never be held accountable for it. After some time without an uprising against it, the SA was replaced by the SS.
nxzero 2 days ago 2 replies      
Curious how many people on HN have had contact with the FBI in the past year?
homero 2 days ago 0 replies      
It's plausible it's because of her name Isis and some agent thinks she's Isis
dang 1 day ago 1 reply      
> I know you consumed them at a really impressionable age, and it was really exciting to think about, but

Please keep this sort of personal snark out of HN comments. It's uncivil and unsubstantive.



draw_down 2 days ago 0 replies      
Sounds like they have no god damn idea what they're doing. But of course, frightening nonetheless.
'Boaty McBoatface' polar ship named after David Attenborough bbc.co.uk
214 points by sghi  2 days ago   247 comments top 36
JorgeGT 2 days ago 11 replies      
Good luck selling plushies, apparel, comics or animated cartoons of lil' fun Sir David Attenborough! Now instead of having the dream name for science outreach and awareness for kids, you have a stark, cynical lesson on democracy.
samsolomon 2 days ago 1 reply      
I don't know. It feels like a missed opportunity to excite kids about research.

Can't you just see the series of children's books about Boaty McBoatface and all the incredible science adventures it has, a Magic School Bus 2.0?

Many people taking themselves too seriously. It's a shame.

sschueller 2 days ago 2 replies      
Sir David Attenborough should have his legal name (or at least his twitter handle :)) changed to 'Sir Boaty McBoatface'. Then see what happens...
planetjones 2 days ago 4 replies      
I think the name chosen is a good one and like David Attenborough himself, the boat's name will endure for a long time.

However, what did the organisation (NERC) expect when they gave a free choice for the names of the boat. They would have been far better having a poll of say 5 names which had been pre-selected.

alblue 2 days ago 5 replies      
The remotely operated vehicle has been named BoatyMcBoatface instead:


_Marak_ 2 days ago 0 replies      
I'm disappointed.

If an organization decides to use a social media contest like this to promote themselves, they should have a social responsibility to adhere to the demands of the people.

"Boaty McBoatface" was a good name.

rectang 2 days ago 1 reply      
Disappointing. It's as if UC Santa Cruz had chosen "Sea Lions" over "Banana Slugs".


pjc50 2 days ago 0 replies      
I think people are underestimating the popularity of David Attenborough - to many people in the UK he is the face, and especially voice, of TV naturalism, and has been for decades.
simonh 2 days ago 0 replies      
Naming one of the remote drones 'Boaty' heads off people calling the ship 'Boaty' as a nickname. I wonder if that's why they did it.
siddboots 2 days ago 3 replies      
Good choice, and also a great way for the organisation to get out of a tough PR situation. Literally no-one dislikes David Attenborough.
whyleyc 2 days ago 2 replies      
It's almost as if representative democracy of this kind is a complete sham. It's a good job this kind of ridiculous situation doesn't crop up here on HN from our YC overlords !

Wait, what. [1]

[1] https://news.ycombinator.com/item?id=11633517

dzdt 2 days ago 0 replies      
Being named Boaty McBoatface would have been such a great icebreaker for starting outreach discussions. A ship like this one needs a great icebreaker sometimes.
sbmassey 2 days ago 0 replies      
Goofy names are usually fun for about 5 minutes, but quickly become seriously annoying if you have to use them continually. I imagine the same is true for ship names as for naming classes or functions.
sccxy 2 days ago 0 replies      
Good outcome from this naming campaign

>We're building on the #BoatyMcBoatface spirit with a £1m Polar Explorer programme to inspire the next generation of scientists and explorers


genmon 2 days ago 0 replies      
They should have called it Pinboard
daimyoyo 2 days ago 0 replies      
If they wanted to name the boat after someone like Attenborough that's great but why open the vote to the public and then ignore the most popular choice?
f4stjack 2 days ago 0 replies      
They can name it whatever they want, that ship has been recorded as "Boaty McBoatface" to my memory.

Also really kudos to them, Boaty McBoatface made everyone smile when they heard the name and got them interested; Sir David Attenborough, however... doesn't have that effect.

smoyer 2 days ago 0 replies      
This just in ...

In a surprise move, Sir David Attenborough legally changes his name to Boaty McBoatface in support of the Internet hordes. Facing waning popularity, this maneuver is seen as a means of returning to the cult-icon status days he experienced as a broadcaster. When asked for comment, Sir McBoatface stated "I'm a few days short of my ninetieth birthday and let's face it - once I'm gone nobody's going to remember a name like Attenborough. As Boaty McBoatface, my name will actually be remembered. Would you like fries with that?"

goffley3 2 days ago 1 reply      
Is it too much just let the internet have its candy? Just this once, when it's completely innocuous.
swamp40 1 day ago 0 replies      
If voting made any difference they wouldn't let us do it. ~ Mark Twain (apocryphal)
sgnelson 1 day ago 0 replies      
If nothing else, I must say that reading through this thread, I now understand much better why I've always thought the world can be such a sad place; There are way too many people who apparently have no sense of humor.
TheCraiggers 2 days ago 0 replies      
When will they learn that holding an online poll to name stuff is never, ever a good idea. They should count themselves lucky the winning entry was work-safe.

Although I still feel that if you're going to go through with it you should stick with the results. Silly people.

transfire 2 days ago 0 replies      
And lo it is proven... the government does not listen to the people.
delecti 1 day ago 0 replies      
"Boaty McBoatface" was just irreverent enough without being offensive that it should have been a perfect choice to gain a bunch of free publicity.
jonathankoren 1 day ago 0 replies      
Am I just wrong in thinking that it's a relatively new phenomenon to name things after living people?
jordanb 1 day ago 0 replies      
The cute name wasn't suitably distinguished, so instead they named it after a television personality.
quocble 1 day ago 0 replies      
Everyone: let's go with the most boring name possible. It's who we are!
unlinker 1 day ago 0 replies      
For some time, iirc, it was about to be called Blas de Lezo.
mherdeg 2 days ago 0 replies      
Wow! Big year for Leicester!
quadyeast 1 day ago 0 replies      
This always cracks me up and I don't know why.
mtgx 2 days ago 0 replies      
And the boat was never to be heard from again (in the media).
pyython 2 days ago 0 replies      
No justice in this world.
pessimizer 1 day ago 1 reply      
Thank God that this boat was named after a television host rather than a whimsy chosen by the public. I guess "seriousness" was a choice between terminally ill children and entertainers. Couldn't find a scientist?
randac 2 days ago 2 replies      
That's great... but meanwhile, he's being downvoted for doing exactly the same as you; sharing his opinion.

Stay classy, HN.

ck2 2 days ago 0 replies      
We need to quit naming things after people in the first place, it's always egotistical and political.
x5n1 2 days ago 2 replies      
They should have just gone with it, and said the people have spoken. It would have been good for morale, because everyone would have a smile on their face every time they said the name in a British accent.
OS X app in plain C github.com
248 points by dmytroi  2 days ago   150 comments top 22
btrask 2 days ago 10 replies      
There was a thread the other day where people were talking about languages that were easily interoperable with the "C ABI" (although there is no such thing). Languages listed included C++, Rust, and maybe a few others--but no mention of Objective-C.

The ability to use a high-level, dynamic language (ObjC), C, and even inline assembly in a single source file is unique to Objective-C (at least among the "mainstream" languages), and something I think is often under-appreciated.

One comment on the code here: for message calls returning type 'id' (object type), you don't need to cast the dispatch function. It would make the code much more readable. For other return types, you need a cast, but I'd wrap it in a macro. You could even use a C11 generic macro to handle floating point return values (unfortunately necessary).

justsaysmthng 2 days ago 5 replies      
Don't do this at home. It looks so ugly because it's dangerous and vice-versa.

If you really want to write (and read) code like this:

id titleString = ((id (*)(id, SEL, const char *))objc_msgSend)((id)objc_getClass("NSString"), sel_registerName("stringWithUTF8String:"), "sup from C");

((void (*)(id, SEL, id))objc_msgSend)(window, sel_registerName("setTitle:"), titleString);

instead of this

[window setTitle:@"sup"];

then I guess it would be easier to write an Objective-C to C converter (if there isn't one already) and convert your Objective-C app into C for added coolness.

0x0 2 days ago 0 replies      
santaclaus 2 days ago 2 replies      
> For some reason if we run the app from command line than menu is not accesible by mouse at first time

This happens with Qt apps on OS X as well, and it drives me bonkers. If you launch from the command line, you have to command-tab away and back for the menu to work.

bdrool 2 days ago 0 replies      
This is what Objective-C looks like if you "unroll" it to plain C. Not too surprisingly, it's very verbose. Other than as an exercise, there's no reason to do this, particularly given that Objective-C is a strict superset of C.[1]

[1] http://stackoverflow.com/questions/19366134/what-does-object...

Hydraulix989 2 days ago 3 replies      
It's amazing and sad for computing that somehow managing to use a sane language to develop software for a very commonly used platform is seen as a serious feat.
sgt 2 days ago 0 replies      
If you compile it, you'll see that the ObjC executable with no ARC and the C-only executable are exactly the same size stripped. The ObjC version with ARC is 392 bytes bigger.
conradev 2 days ago 0 replies      
And introduced in OS X 10.10 is the ability to build an OS X app in pure JavaScript:


(Atwood's Law in action)

userbinator 2 days ago 2 replies      
As someone who has done a lot of Win32 programming in plain C, I can recognise some common patterns like an event loop and window creation, but it's quite amazing how much extra "cruft" there is just to deal with what appears to be the contortions of OOP'ing everything. All those string constants are rather surprising too. The interesting thing is, the few times I've had to use a Mac, the GUI didn't feel quite as responsive as Windows on the same hardware --- not jerky or stuttering, just what I'd describe as "smooth but sluggish" --- and the binaries were noticeably larger. I now wonder if this increased abstraction and complexity has anything to do with it.

For comparison, a similar Win32 app in plain C:


X Windows app in plain C:


...and just for fun, the same simple Win32 app from above in Assembly language:


Looking at the code again, if I were forced to write an OS X app in plain C, there would be plenty of macro usage around objc_msgSend.

mcmatterson 2 days ago 0 replies      
A few years ago I dove into the idea of trying to make a C-only iOS app (well, actually, it was a quick exploration of how hard it would be to build your own UI kit without using UIKit). The dive bottomed out pretty quickly, as I couldn't figure out a way to access windows without resorting to UIWindow methods, and `UIGraphicsGetCurrentContext()` acted super wonky when called without 'proper' bootstrapping around it. I was expecting more, given the usual layered approach that Apple takes with their system libraries.
mayoff 2 days ago 0 replies      
Richard J. Ross III did it a few years ago for iOS:


eyeareque 2 days ago 1 reply      
I'm not a developer, but does it count when they use objc header files?

#include <objc/objc.h>
#include <objc/runtime.h>
#include <objc/message.h>
#include <objc/NSObjCRuntime.h>

tosseraccount 2 days ago 0 replies      
As a side note: http://stackoverflow.com/questions/525609/use-c-with-cocoa-i...

A commenter at stackoverflow provides some code that "shows how to access Cocoa GUI from pure C/C++ and build a truly functional GUI application"

He notes that you must "link against Cocoa framework".

sdegutis 2 days ago 1 reply      
My interest in this comes from wanting to build Objective-C based OSX-UIs using scripting languages that are only embeddable in C, like Lua. When (if?) that day comes, I'll be like a kid in a candy store all over again.
icodestuff 2 days ago 3 replies      
I don't buy it. Yes, the language may be C, but if you're linking the ObjC runtime and frameworks, all you're really avoiding here is the nib-loading machinery.

To be fair, I think that's a perfectly reasonable goal in and of itself - showing how to set up an OS X app "from scratch" is definitely something I'm interested in - but we shouldn't pretend this isn't strongly relying on Objective-C to actually get things done.

EDIT: indeed, the makefile copies main.c to main.m before invoking clang, which means it's using the Objective-C compiler.

What would be much more impressive is if this used only the C Core* frameworks (Core Foundation and Core Graphics in particular) to do the same thing.

justinlardinois 2 days ago 3 replies      
I'm sure the Windows version would be just as bad. Most of the useful parts of the Windows API don't have a C interface, so you have to use COM, which essentially means editing vtable-like structures by hand.


pps43 2 days ago 0 replies      
That's a lot of boilerplate code. Is there a way to make it smaller? For example, if all I need is to open and process a file with no GUI (maybe a progress bar).
whitehat2k9 2 days ago 0 replies      
Well, that looks rather unpleasant.
beamatronic 2 days ago 1 reply      
Outstanding! I have been wanting to see this for a long time. I really want to build OS X apps, but I personally prefer to build GUIs programmatically.
chris_wot 2 days ago 0 replies      
Would love to see this annotated.
sigjuice 2 days ago 2 replies      
I don't buy the "plain C" claim, given the gratuitous use of objc_msgSend().
jheriko 2 days ago 1 reply      
sorry, but mediocre at best, full of mistakes at worst.

i'm sure it was a good exercise in learning objective-c but this is not an amazing achievement. i'd expect much better from someone competent reading the god awful docs and reverse engineering by inspection.

don't give up though. doing things like this over and over, especially in the face of nay saying arses like me, is what makes great programmers. :)

       cached 8 May 2016 15:11:01 GMT