hacker news with inline top comments    21 Apr 2014 Best
Lens Blur in the new Google Camera app googleresearch.blogspot.com
604 points by cleverjake  4 days ago   238 comments top 56
jawns 4 days ago 5 replies      
Regarding the technology (achieving shallow depth of field through an algorithm), not Google's specific implementation ...

Up until now, a decently shallow depth of field was pretty much only achievable in DSLR cameras (and compacts with sufficiently large sensor sizes, which typically cost as much as a DSLR). You can simulate it in Photoshop, but generally it takes a lot of work and the results aren't great. The "shallow depth of field" effect was one of the primary reasons why I bought a DSLR. (Yeah, yeah, yeah, quality of the lens and sensor are important too.) Being able to achieve a passable blur effect, even if it's imperfect, on a cellphone camera is really pretty awesome, considering the convenience factor. And if you wanted to be able to change the focus after you take the picture, you had to get a Lytro light field camera -- again, as expensive as a DSLR, but with a more limited feature set.

Regarding Google's specific implementation ...

I've got a Samsung Galaxy S4 Zoom, which hasn't yet gotten the Android 4.4 update, so I can't use the app itself to evaluate the Lens Blur feature, but based on the examples in the blog post, it's pretty good. It's clearly not indistinguishable from optical shallow depth of field, but it's not so bad that it's glaring. That you can adjust the focus after you shoot is icing on the cake, but tremendously delicious icing. The S4 Zoom is a really terrific point-and-shoot that happens to have a phone, so I'm excited to try it out. Even if I can use it in just 50% of the cases where I now lean on my DSLR, it'll save me from having to lug a bulky camera around AND be easier to share over wifi/data.

grecy 4 days ago 9 replies      
We had an interesting discussion about this a few nights ago at a Photojournalism talk.

In that field, digital edits are strictly banned, to the point that multiple very well known photojournalists have been fired for a single use of the clone tool [1] or other minor edits.

It's interesting to think I can throw an f/1.8 lens on my DSLR and take a very shallow depth of field photo, which is OK, even though it's not very representative of what my eyes saw. If I take the photo at f/18 then use an app like the one linked, producing extremely similar results, that's banned. Fascinating what's allowed and what's not.

What I find even more interesting is that converting color photos to B/W is allowed, as is almost anything that "came straight off the camera", no matter how far it strays from what your eyes saw.

[1] http://www.toledoblade.com/frontpage/2007/04/15/A-basic-rule...

DangerousPie 4 days ago 4 replies      
Isn't this just a copy of Nokia's Refocus?


edit - better link: http://www.engadget.com/2014/03/14/nokia-refocus-camera-app-...

salimmadjd 4 days ago 3 replies      
Is the app taking more than one photo? It wasn't clear in the blog post. AFAIU to have any depth perception you need to take more than one photo. Calculate the pupil distance (the distance the phone moved) then match image features between the two or more images. Calculate the amount of movement between the matching features to then calculate the depth.

As described, you then map the depth to an alpha transparency and apply the blurred image, with varying blur strength, over the original image.

Since you're able to apply the blur after taking the image, that would mean the Google camera always takes more than one photo.
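
The two steps described here (parallax to depth, depth to a blended blur) can be sketched roughly like this. This is my own toy illustration of the idea, not Google's implementation; the function names and the flat grayscale-array image representation are invented for the example:

```javascript
// Depth from parallax: a feature that shifts more between two frames is
// closer to the camera; depth is inversely proportional to disparity.
function depthFromDisparity(disparityPx, baselineMm, focalLengthPx) {
  return (baselineMm * focalLengthPx) / disparityPx;
}

// Depth-to-blur compositing: blend a pre-blurred copy of the image over
// the sharp original, with per-pixel alpha derived from each pixel's
// distance to the chosen focal plane. "Images" here are flat arrays of
// grayscale values.
function applyLensBlur(sharp, blurred, depth, focalDepth, strength) {
  return sharp.map((v, i) => {
    const alpha = Math.min(1, Math.abs(depth[i] - focalDepth) * strength);
    return v * (1 - alpha) + blurred[i] * alpha;
  });
}
```

Refocusing after the fact then just means re-running the cheap compositing step with a different focalDepth, which is presumably why the app can offer it interactively.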

Also, a cool feature would be to animate the transition from no blur to DOF blur as a short clip, or to use the depth information to apply effects other than blur, like selective coloring or other filters.

dperfect 4 days ago 5 replies      
I believe the algorithm could be improved by applying the blur to certain areas/depths of the image without including pixels from very distant depths, and instead blurring/feathering edges with an alpha channel over those distant (large depth separation) pixels.

For example, if you look at the left example photo by Rachel Been[1], the hair is blurred together with the distant tree details. If instead the algorithm detected the large depth separation there and applied the foreground blur edge against an alpha mask, I believe the results would look a lot more natural.

[1] http://4.bp.blogspot.com/-bZJNDZGLS_U/U03bQE2VzKI/AAAAAAAAAR...
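
A toy version of that idea, for concreteness (my own sketch, not the app's algorithm): a blur that simply refuses to average in neighbors lying at a very different depth, so background detail can't bleed into foreground edges. A real implementation would feather the edge with an alpha mask rather than use a hard cutoff, as the parent comment suggests. Shown in 1-D on grayscale values:

```javascript
// 1-D box blur that excludes neighbors whose depth differs from the
// center pixel by more than maxDepthGap (e.g. hair vs. distant trees).
function depthAwareBlur(pixels, depth, radius, maxDepthGap) {
  return pixels.map((_, i) => {
    let sum = 0, count = 0;
    const lo = Math.max(0, i - radius);
    const hi = Math.min(pixels.length - 1, i + radius);
    for (let j = lo; j <= hi; j++) {
      if (Math.abs(depth[j] - depth[i]) <= maxDepthGap) {
        sum += pixels[j];
        count += 1;
      }
    }
    return sum / count; // safe: the center pixel always qualifies
  });
}
```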

nostromo 4 days ago 6 replies      
I sure wish you could buy a DSLR that just plugs into your iPhone. I don't want any of that terrible DSLR software -- just the hardware.

I think many devices should become BYOD (bring your own device) soon, including big things like cars.

edit: I don't just want my pictures to be saved on my phone. I'd like the phone to have full control of the camera's features -- so I can use apps (like timelapse, hdr, etc.) directly within the camera.

panrafal 2 hours ago 0 replies      
I've created a parallax viewer for lens blur photos. It's an open source web app available at http://depthy.stamina.pl/ . It lets you extract the depthmap, works on Chrome with WebGL, and looks pretty awesome on some photos. There are quite a few things you can do with this kind of image, so feel free to play around with the source code on GitHub: https://github.com/panrafal/depthy
themgt 4 days ago 5 replies      
Is looking at the examples giving anyone else a headache? It's like the software blur falls into some kind of uncanny valley for reality.
kbrower 4 days ago 1 reply      
I did a quick comparison of a full frame SLR vs a Moto X with this lens blur effect. I tried to match the blur amount, but made no other adjustments. Works really well compared to everything else I have seen! http://onionpants.s3.amazonaws.com/IMG_0455.jpg
fidotron 4 days ago 0 replies      
Doesn't look totally convincing, but it's good for a first version.

The real problem with things like this is that the effect became cool by virtue of needing dedicated equipment. Take that away, and people's desire to apply the effect will be greatly diminished.

Spittie 4 days ago 2 replies      
I find it funny that this was one of the "exclusive features" of the HTC One M8, thanks to its double camera, and days after its release Google is giving the same ability to every Android phone.

I'm sure the HTC implementation works better, but this is still impressive.

nileshtrivedi 4 days ago 3 replies      
With these algorithms, will it become feasible to make a driverless car that doesn't need a LIDAR and can run with just a few cameras?

Currently, the cost of LIDAR is prohibitive for making (or even experimenting with) a DIY self-driving car.

sytelus 4 days ago 2 replies      
Wow.. this is missing the entire point of why lens blur occurs. Lens blur in normal photographs is the price you pay for focusing sharply on a subject. The reason photos with blur look "cool" is not the blur itself but that the subject is so sharply focused that its details are an order of magnitude better. If you take a random photo, calculate a depth map somehow, and blur out everything but the subject, then you are taking away information from the photo without adding information to the subject. The photos would look "odd" to trained eyes at best. For casual photography, it may look slightly cool on small screens like phones because of the relatively increased perceived focus on the subject, but it's fooling the eyes of the casual viewer. If they want to really do it (i.e. add more detail to the subject) then they should use multiple frames to increase the resolution of the photograph. There is a lot of research being done on that. Subtracting detail from the background without adding detail to the subject is like doing an Instagram. It may be cool to teens, but professional photographers know it's bad taste.
scep12 4 days ago 2 replies      
Impressive feat. Took a few snaps on my Nexus 4 and it seems to work really well given a decent scene.
jnevelson 4 days ago 1 reply      
So Google basically took what Lytro has been using hardware to achieve, and did it entirely in software. Pretty impressive.
frenchman_in_ny 4 days ago 2 replies      
Does this pretty much blow Lytro out of the water, and mean that you no longer need dedicated hardware to do this?
anigbrowl 4 days ago 0 replies      
It's interesting that the DoF is calculated in the app. I'm wondering if this uses some known coefficients of smartphone cameras to save computation, but in any case I hope this depth mapping becomes available in plugin form for Photoshop and other tools.

As an indie filmmaker, it would save a lot of hassle to be able to shoot at infinity focus all the time and apply bokeh afterwards. Of course, an algorithmic version would likely never get close to what you can achieve with quality optics, but in many situations where image quality is "good enough" for artistic purposes (e.g. shooting with a video-capable DSLR), faster is better.

angusb 4 days ago 0 replies      
A couple of other really cool depth-map implementations:

1) The Seene app (iOS app store, free), which creates a depth map and a pseudo-3d model of an environment from a "sweep" of images similar to the image acquisition in the article

2) Google Maps Photo Tours feature (available in areas where lots of touristy photos are taken). This does basically the same as the above but using crowdsourced images from the public.

IMO the latter is the most impressive depth-mapping feat I've seen: the source images are amateur photography from the general public, so they are randomly oriented (and without any gyroscope orientation data!), and uncalibrated for things like exposure, white balance, etc. Seems pretty amazing that Google have managed to make depth maps from that image set.

Splendor 4 days ago 0 replies      
Isn't the real story here that Google is continuing to break off core pieces of AOSP and offer them directly via the Play Store?
tdicola 4 days ago 0 replies      
Neat effect. I'm definitely interested in trying this app. Would be cool to see them go further and try to turn highlights in the out-of-focus areas into nice octagons or other shapes caused by the aperture blades in a real camera.
gamesurgeon 4 days ago 2 replies      
One of the greatest features is the ability to change your focus point AFTER you shoot. This is huge.
kingnight 4 days ago 1 reply      
I'd like to see an example of an evening/night shot using this. I can't imagine the results are anything like the examples here, but would love to be surprised.

Are there more samples somewhere?

mauricesvay 4 days ago 0 replies      
The interesting part is not that it can blur a part of the image. The interesting part is that it can generate a depth map automatically from a series of images taken from different points of view, using techniques used in photogrammetry.
goatslacker 4 days ago 0 replies      
On iOS you can customize your DoF with an app called Big Lens.

Normally apps like Instagram and Fotor let you pick one point in the picture or a vertical/horizontal segment and apply focus there while blurring the background. Big Lens is more advanced since it lets you draw with your finger what you'd like to be in focus.

They also include various apertures you can set (as low as f/1.8) as well as some filters -- although I personally find the filters to be overdone but others might find them tasteful.

bckrasnow 4 days ago 1 reply      
Well, the Lytro guys are screwed now. They're selling a $400 camera with this feature as the main selling point.
spot 4 days ago 0 replies      
i just noticed i have the update and i tried it out. wow, first try. amazing: https://plus.google.com/+ScottDraves/posts/W4ozBLTBmKy
jestinjoy1 4 days ago 1 reply      
This is what I got with a Moto G and the Google Camera app: http://i.imgur.com/a6AxO4e.jpg
defdac 4 days ago 0 replies      
Is this related to the point cloud generation feature modern compositing programs like Nuke use? Example/tutorial video: http://vimeo.com/61463556 (skip to 10:27 for the magic).
thenomad 3 days ago 0 replies      
So, is there a way to get the depth map out of the image separately for more post-processing?

Fake DOF is nice, but there are a lot more fun things you can use a depth map for. For example, it seems like ghetto photogrammetry (turning photographs into 3D objects) wouldn't be too far away.
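
That's essentially just unprojecting each depth sample through a pinhole camera model. A minimal sketch, assuming made-up intrinsics (a real app would pull the focal length and principal point from camera metadata):

```javascript
// Turn a flat depth map into a 3-D point cloud via the pinhole model:
// x = (u - cx) * z / f, y = (v - cy) * z / f, with z taken from the map.
function depthMapToPointCloud(depth, width, f, cx, cy) {
  return depth.map((z, i) => {
    const u = i % width;
    const v = Math.floor(i / width);
    return { x: ((u - cx) * z) / f, y: ((v - cy) * z) / f, z };
  });
}
```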

Lutin 4 days ago 0 replies      
This app is now on the Play Store and works with most phones and tablets running Android 4.4 KitKat. Unfortunately it seems to crash on my S3 running CM 11, but your experience may vary.


insickness 4 days ago 1 reply      
> First, we pick out visual features in the scene and track them over time, across the series of images.

Does this mean it needs to take multiple shots for this to work?

mcescalante 4 days ago 3 replies      
I may be wrong because I don't know much about image based algorithms, but this seems to be a pretty successful new approach to achieving this effect. Are there any other existing "lens blur" or depth of field tricks that phone makers or apps are using?

I'd love to see their code open sourced.

zmmmmm 4 days ago 0 replies      
If nothing else, these improvements make HTC's gimmick of adding the extra lens while giving up OIS seem all the more silly.
techaddict009 4 days ago 0 replies      
Just installed it. Frankly speaking I loved the new app!
jheriko 4 days ago 0 replies      
This sounds clever, but also massively complex for what it does. I don't have anything finished, but I can think of a few approaches to this that don't need to reconstruct 3D scenes with clever algorithms... still very neat visually, even if technically underwhelming.
the_cat_kittles 4 days ago 1 reply      
Isn't it interesting how, by diminishing the overall information content of the image by blurring it, it actually communicates more (in some ways, particularly depth) to the viewer?
anoncow 4 days ago 1 reply      
How is Nokia Refocus similar to or different from this? It allows refocusing on a part of the image, which blurs out the rest. (Not a pro) https://refocus.nokia.com/
guardian5x 4 days ago 1 reply      
I guess that is exactly the same as Nokia's Refocus, which has been on Lumia phones for quite some time: https://refocus.nokia.com/
marko1985 4 days ago 0 replies      
Happy for this "invention", but I would wait for the point when smartphones have laser sensors for depth measurement, so the calculation doesn't require a sequence of pictures; the main subject could move quickly and deform the final picture or the blur effect. But for static photography or selfies it looks amazing.
coin 4 days ago 0 replies      
Shallow depth of field is so overused these days. I much prefer having the entire frame in focus, letting me decide what to focus on. I understand the photographer is trying to emphasize certain parts of the photo, but in the end it feels too limiting. It's analogous to mobile-"optimized" websites: just give me all the content and I'll choose what I want to look at.
CSDude 4 days ago 0 replies      
I wonder what the exact reason is that my country is not included. It's just a fricking camera app.
benmorris 4 days ago 0 replies      
The app is fast on my Nexus 5. The lens blur feature is really neat. I've taken some pictures this evening and they have turned out great. Overall a nice improvement.
dharma1 4 days ago 0 replies      
The accurate depth map creation from two photos on a mobile device is impressive. The rest has been done many times before.

This is cool, but I'm waiting more for RAW images to be exposed in the Android camera API. It will be awesome to do some cutting-edge tonemapping on the 12 bits of dynamic range that the sensor gives, which is currently lost.

ohwp 4 days ago 0 replies      
Nice! Since they've got a depth map, 3D scanning could be a next step.
sivanmz 4 days ago 0 replies      
It's a cool gimmick that would be useful for Instagram photos of food. But selfies will still be distorted when taken up close with a wide angle lens.

It would be interesting to pair this with Nokia's high megapixel crop-zoom.

spyder 4 days ago 0 replies      
But it can only be used on static subjects, because it needs a series of frames for depth.
thomasfl 4 days ago 0 replies      
I wish Google Camera would get ported to iOS. The best alternative for iOS seems to be the "Big Lens" app, where you have to manually create a mask to specify the focused area.
bitJericho 4 days ago 0 replies      
If you couple this with instagram does it break the cosmological fabric?
servowire 4 days ago 3 replies      
I'm no photographer, but I was taught this was called bokeh, not blur. Blur is more from motion during an open shutter.
avaku 4 days ago 0 replies      
So glad I did the Coursera course on Probabilistic Graphical Models, so I totally have an understanding of how this is done when they mention Markov Random Field...
matthiasb 4 days ago 0 replies      
I don't see this mode. I have a Note 3 from Verizon. Do you?
DanielBMarkham 4 days ago 0 replies      
Lately I've been watching various TV shows that are using green screen/composite effects. At times, I felt there was some kind of weird DOF thing going on that just didn't look right.

Now I know what that is. Computational DOF. Interesting.

Along these lines, wasn't there a camera technology that came out last year that allowed total focus/DOF changes post-image-capture? It looked awesome, but IIRC, the tech was going to be several years until released.

ADD: Here it is. Would love to see this in stereo 4K: http://en.wikipedia.org/wiki/Lytro The nice thing about this tech is that in stereo, you should be able to eliminate the eyeball-focus strain that drives users crazy.

apunic 4 days ago 0 replies      
Game changer
alexnewman 4 days ago 0 replies      
Got me beat
seba_dos1 4 days ago 0 replies      
Looks exactly like the "shallow" mode of the BlessN900 app for the Nokia N900 from a few years ago.

It's funny to see how most of the "innovations" in the mobile world presented today by either Apple or Google were already implemented on open or semi-open platforms like Openmoko or Maemo a few years before. Most of them only as experiments, granted, but it still shows what the community is capable of on its own when unnecessary restrictions aren't put on it.

sib 4 days ago 2 replies      
If only they had not confused shallow depth of field with Bokeh (which is not the shallowness of the depth of field, but, rather, how out-of-focus areas are rendered), this writeup would have been much better.


Cool technology, though.

The Birth and Death of JavaScript [video] destroyallsoftware.com
584 points by gary_bernhardt  3 days ago   227 comments top 34
lelandbatey 3 days ago 2 replies      
First, I very much love the material of the talk, and the idea of Metal. It's fascinating, really makes me think about the future.

However, I also want to rave a bit about his presentation in general! It was very nicely delivered, for many reasons. His commitment to the story, of programming from the perspective of 2035, was excellent and in many cases subtle. His deadpan delivery really added to the humor; the fact that he didn't even smile during any of the moments when the audience was laughing made it all the more engaging.

Fantastic talk, I totally loved it!

tinco 3 days ago 2 replies      
The reason why METAL doesn't exist now is that you can't turn the memory protection stuff off in modern CPUs.

For some weird reason (I'm not an OS/CPU developer), switching to long mode on an x86 CPU also turns on the MMU. You just can't have one without the other.

There's a whole bunch of research on software-managed VM operating systems, from back when VMs started becoming really good. Microsoft's Singularity OS was the hippest, I think [0].

Perhaps ARM CPUs don't have this restriction, and we'll benefit from ARM's rise sometime?

[0] http://research.microsoft.com/en-us/projects/singularity/

jerf 3 days ago 0 replies      
It's not far off my predictions: https://news.ycombinator.com/item?id=6923758

Though I'm far less funny about it.

vanderZwan 3 days ago 2 replies      
I guess this is in a way a response to Bret Victor's "The Future of Programming"?


spyder 3 days ago 0 replies      
Looks like Erlang is already getting one step closer to the metal:


Also there is another project that can be related to that goal:

"Our aim is to remove the bloated layer that sits between hardware and the running application, such as CouchDB or Node.js"


jongalloway2 3 days ago 0 replies      
Coincidentally, I just released a podcast interview with Gary right after he gave this talk at NDC London in December 2013: http://herdingcode.com/herding-code-189-gary-bernhardt-on-th...

It's an 18 minute interview, and the show notes are detailed and timestamped. I especially liked the references to the Singularity project.

cjbprime 3 days ago 2 replies      
For context, this was one of the most enjoyed talks at PyCon this year.
joelangeway 3 days ago 5 replies      
He says several times that JavaScript succeeded in spite of being a bad language because it was the only choice. How come we're not all writing Java applets or Flash apps?
nkozyra 3 days ago 0 replies      
Extraordinarily entertaining and well presented.
vorg 3 days ago 0 replies      
I suspect Nashorn, the just-released edition of JavaScript for the JVM, will be heavily promoted by Oracle and become heavily used for quick-and-dirty scripts for manipulating and testing Java classes, putting a dent in the use of Groovy and Xtend in Java shops. After all, people who learn and work in Java will want to learn JavaScript for the same sorts of reasons.
igravious 3 days ago 0 replies      
Stellar stuff. Hugely enjoyable. Very interesting thought experiment. I won't spoil it for any of you, just go and watch! Mr. Bernhardt, you have outdone yourself sir :)
mgr86 3 days ago 8 replies      
I'm missing some obvious joke...but why is he pronouncing it yava-script.
atmosx 2 days ago 1 reply      
I have a question, because this video confused me. I don't have the background to follow all the assertions Gary Bernhardt made, but I'll try to watch it again, since it was fun.

I want to become a full stack developer. I can program and write tests in Ruby, I can write applications using Sinatra, and now I am learning Rails. I bought a book to start learning JavaScript because it's the most popular language and will basically allow me to write modern applications. After I'm done with JS I'll probably jump into something else (Rust, Go, C, C++, Java, whatever helps me do the stuff I want).

But watching this video, I'm confused: I avoided CoffeeScript because I read in their documentation that in order to debug the code you have to actually know JavaScript, so I figured the best thing to do is learn JS and then use an abstraction (i.e. CoffeeScript) and tools like AngularJS and Node.js... Is my approach wrong? :-/

dsparry 3 days ago 1 reply      
Very impressive to have been recorded "April 2014" and released "April 2013." Seriously, though, great presentation.
steveklabnik 3 days ago 1 reply      
Consider the relationship between Chromebooks and METAL.

(I'm typing this from my Pixel...)

Sivart13 3 days ago 1 reply      
Where did you get the footage of Epic Citadel used in the talk?

http://unrealengine.com/html5 seems to have been purged from the internet (possibly due to this year's UE4 announcements?) and I can't find any mirrors anywhere.

Which is a shame, because that demo was how I used to prove to people that asm.js and the like were a Real Thing.

leichtgewicht 2 days ago 4 replies      
I like that he mentions "integer". It is still pretty incredible how JavaScript can work well without an integer construct. Or threads and shared memory. Or bells and whistles.
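
For the curious, what "no integer construct" means in practice: every JavaScript number is an IEEE-754 double, which represents integers exactly only up to 2^53 - 1, while bitwise operators coerce to 32-bit integers (the trick asm.js-style code relies on):

```javascript
// All numbers are doubles; whole numbers are exact only up to 2^53 - 1.
const big = 9007199254740992;   // 2^53
console.log(big + 1 === big);   // true: 2^53 + 1 is not representable

// Bitwise ops truncate to 32-bit ints, giving integer division:
const intDiv = (a, b) => (a / b) | 0;
console.log(intDiv(7, 2));      // 3, not 3.5
```
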
base698 3 days ago 1 reply      
I wish some of those talks were available for purchase on their own and not in the season packs. Definitely a few I'd buy, since I liked this talk and the demo on the site.

Guy has good vim skills for sure.

jr06 2 days ago 1 reply      
Video tl;dw:

Gary Bernhardt (rightly) says that JavaScript is shit (with some other insights).

HN comments tl;dr:

50%: "Waahhh, JavaScript is awesome and Node.js is wonderful, shut up Gary Bernhardt."

25%: Smug twats talking about how they're too busy changing the world with JavaScript to even bother to comment.

25%: Pedants and know-it-alls having sub-debates within sub-debates.

Pretty standard turnout. See you tomorrow.

angersock 3 days ago 0 replies      
It's been kind of fun watching JS developers reinventing good chunks of computer science and operating systems research while developing node.

This talk has convinced me that their next step will be attempting to reinvent computer engineering itself.

It's a pretty cool time to be alive.

alexandercrohde 2 days ago 0 replies      
I guess I don't really get the point here. This video walks a line between comedy and fact, and I'm not really satisfied with either.

I can't always tell what's a joke. Does he actually believe people would write software to compile to asm.js instead of JavaScript because there are a few WTFs in JS's "hashmaps"? More likely a newer version of the language will come out before 2035. Or was that a joke?

I also feel like poking fun at "yavascript" at a python conference is cheap and plays to an audience's basest desires.

Really I see a mixture of the following:

- Predictions about the future, some of which are clearly jokes (e.g. the 5-year war)
- Insulting JavaScript while preferring Clojure
- Talking about weird shit you could, but never would, do with asm.js
- Talking about a library that allegedly runs native code 4% faster in some benchmarks, with a simplistic explanation about ring 0 to ring 3 overhead

Kiro 3 days ago 2 replies      
A bit OT but what is the problem with omitting function arguments?
camus2 3 days ago 3 replies      
Nice, nice. Ultimately, languages don't die unless they are closed source and used for a single purpose (AS3). In 2035 people will still be writing JavaScript. I wonder what the language will look like, though. Will it get type hinting like PHP? Or type coercion? Will it enforce strict encapsulation and message passing like Ruby? Will I be able to create ad-hoc functions just by implementing call/apply on an object? Or subclass Array? Anyway, I guess we'll still be writing a lot of ES5 in the five years to come.
pookiepookie 3 days ago 0 replies      
I'm not sure I understand the claims toward the end of the talk about there no longer being binaries and debuggers and linkers, etc. with METAL.

I mean, instead of machine code "binaries", don't we now just have asm blobs? What happens when I need to debug some opaque asm blob that I don't have the source to? Wouldn't I use something not so unlike gdb?

Or what happens when one asm blob wants to reuse code from another asm blob -- won't there have to be something fairly analogous to a linker to match them up and put names from both into the VM's namespace?

yoamro 3 days ago 0 replies      
I absolutely loved this.
ika 3 days ago 0 replies      
I always enjoy Gary's talks.
jokoon 3 days ago 2 replies      
I want a C interpreter
slashnull 3 days ago 0 replies      
ha-zum yavascript
jliechti1 3 days ago 7 replies      
For those unfamiliar, Gary Bernhardt is the same guy who did the famous "Wat" talk on JavaScript:


granttimmerman 3 days ago 8 replies      
> xs = ['10', '10', '10']

> xs.map(parseInt)

[10, NaN, 2]

Javascript is beautiful.
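
For anyone puzzled by that output: Array.prototype.map passes (element, index, array) to its callback, and parseInt takes an optional radix as its second argument, so the indices silently become radices:

```javascript
// map invokes parseInt('10', 0), parseInt('10', 1), parseInt('10', 2):
//   radix 0 means "auto-detect" (decimal here)  -> 10
//   radix 1 is invalid                          -> NaN
//   radix 2 parses '10' as binary               -> 2
const broken = ['10', '10', '10'].map(parseInt);            // [10, NaN, 2]

// The fix: forward only the element, with an explicit radix.
const fixed = ['10', '10', '10'].map(x => parseInt(x, 10)); // [10, 10, 10]
```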

adamman 3 days ago 1 reply      
"It's not pro- or anti-JavaScript;"


h1karu 3 days ago 1 reply      
somebody tell this to the node.js crowd
Fasebook 3 days ago 0 replies      
"I get back to the DOM"
inglor 3 days ago 0 replies      
This is actually not a bad lecture. Very interesting, a nice idea and surprising.
The New Linode Cloud: SSDs, Double RAM and much more linode.com
544 points by qmr  4 days ago   273 comments top 54
madsushi 3 days ago 8 replies      
Why do I pay Linode $20/month instead of paying DO $5/month(1)?

Because Linode treats their servers like kittens (upgrades, addons/options, support), and DO treats their servers like cattle. There's nothing wrong with the cattle model of managing servers. But I'm not using Chef or Puppet, I just have one server that I use to put stuff up on the internet and host a few services. And Linode treats that one solitary server better than any other VPS host in the world.

(1) I do have one DO box as a simple secondary DNS server, for provider redundancy

kyrra 4 days ago 6 replies      
I forgot to benchmark the disk before I upgraded but here are some simple disk benchmarks on an upgraded linode (the $20 plan, now with SSD)

  $ dd bs=1M count=1024 if=/dev/zero of=test conv=fdatasync
  1024+0 records in
  1024+0 records out
  1073741824 bytes (1.1 GB) copied, 1.31593 s, 816 MB/s

  $ hdparm -tT /dev/xvda
  /dev/xvda:
   Timing cached reads:   19872 MB in  1.98 seconds = 10020.63 MB/sec
   Timing buffered disk reads: 2558 MB in  3.00 seconds = 852.57 MB/sec
Upgraded cpuinfo model: Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz

Old cpuinfo model: Intel(R) Xeon(R) CPU L5520 @ 2.27GHz

CPUs compared: http://ark.intel.com/compare/75277,40201

nivla 4 days ago 3 replies      
Awesome news. Competition really pushes companies to please their customers. Ever since DigitalOcean became the new hip thing, Linode has been pushing harder. My experience with them has been mixed. Forgiving their previous mishaps and the feeling that the level of customer service has gone down, they have been decent this past year. I wouldn't mind recommending them.

[Edit: Removed the bit about DigitalOcean Plans. If you have Ghostery running, it apparently takes out the html block listing different plans]

rjknight 4 days ago 10 replies      
It looks like Linode are still leaving the "incredibly cheap tiny box" market to DO. Linode's cheapest option is $20/month, which makes it slightly less useful for the kind of "so cheap you don't even think about it" boxes that DO provide.
pavanky 3 days ago 2 replies      
I wish Linode (or anyone else other than Amazon) provided a reasonable plan [1] with GPUs.

[1]: Amazon charges $2 an hour; that's about $1500 a month.

conorh 3 days ago 2 replies      
Benchmarks using wrk on the smallest Linode (1024, now 2048), serving a page from an untuned Rails application behind nginx/Passenger that gets almost no other traffic. Hard to compare, of course, given the various other factors, but it produced slightly lower performance after the upgrade. Serving a page from nginx directly (no Rails) showed no appreciable difference; I guess the Rails page serving is more vCPU-bound?

Before Upgrade:

  Running 30s test @ http://...
    5 threads and 20 connections
    Thread Stats   Avg      Stdev     Max   +/- Stdev
      Latency   308.91ms  135.01ms 985.82ms   80.00%
      Req/Sec    14.15      4.61    24.00     66.36%
    2206 requests in 30.00s, 28.51MB read
  Requests/sec:     73.53
  Transfer/sec:      0.95MB
After Upgrade:

  Running 30s test @ http://..
    5 threads and 20 connections
    Thread Stats   Avg      Stdev     Max   +/- Stdev
      Latency   321.74ms  102.45ms 957.74ms   87.32%
      Req/Sec    12.02      2.18    17.00     80.75%
    1858 requests in 30.01s, 24.03MB read
  Requests/sec:     61.92
  Transfer/sec:    819.98KB

endijs 4 days ago 3 replies      
The most interesting part of this great upgrade is that they went from an 8-CPU setup to a 2-CPU setup. But yeah, 2x more RAM and SSDs guarantee that I'm not going to switch anytime soon. Sadly I need to wait a week until this is available in London.
__xtrimsky 3 days ago 3 replies      
I still prefer OVH.com: http://www.ovh.com/us/vps/vps-classic.xml

for $7 you get:2 cores2GB RAM

for 10$ you get:3 cores4GB RAM

They don't have SSD, but SSD doesn't do everything, I prefer more ram.

EDIT: If some of you don't know OVH, it's because it's new in America, but it's not some cheap company. It's a European company that is very successful there and just recently opened a datacenter in North America. (I used to live in France, and have known them for some years.)

munger 3 days ago 2 replies      
Rackspace Cloud customer here. These Linode upgrades are very tempting; they might entice me to switch.

I get that I might not be their target market (a small business with about $1000/month of IaaS spending), but there are a couple of things preventing me from doing so:

1) A $10/month size suitable for a dev instance.

2) Some kind of scalable file storage solution with CDN integration, like RS CloudFiles/Akamai or AWS S3/CloudFront, or block storage to attach to an individual server.

I guess you get what you pay for: in infrastructure components and flexibility, AWS > RS > Linode > DO, which roughly matches the price points.

relaxatorium 4 days ago 2 replies      
This seems pretty fantastic, I am excited to upgrade and think the SSD storage is going to be really helpful for improving the performance of my applications hosted there.

That said, I am not an expert on CPU virtualization but I did notice that the new plans are differently phrased than the old ones here. The old plans all talked about 8 CPU cores with various 1x, 2x priority levels (https://blog.linode.com/2013/04/09/linode-nextgen-ram-upgrad... for examples), while the new plans all talk about 1, 2, etc. core counts.

Could anyone with more expertise here tell me whether this is a sneaky reduction in CPU power for the lower tiered plans, or just a simpler way of saying the same thing as the old plans?

giulianob 4 days ago 0 replies      
Holy crap this is awesome. Good job guys at Linode. I said I would switch if the prices dropped about 25% because RAM was pricey.... So now I have to switch.
raverbashing 4 days ago 0 replies      
Congratulations to Linode

I stopped being a customer when I migrated to DO, but my needs were really small

But I think their strategy of keeping the price and increasing capabilities is good. Between $5 and $20 is a "big" difference for one person (still, it's a day's lunch); for a company it's nothing.

However, I would definitely go to Linode for CPU/IO intensive tasks. Amazon sucks at these (more benchmarks between the providers are of course welcome)

nilved 3 days ago 2 replies      
Linode's recent upgrades are awesome, but people are very quick to forget the period where they were being hacked left and right and didn't communicate with their customers until a defensive blog post weeks after the fact. No matter how good the servers may be, Linode should be a non-starter for anybody who cares about the security of their droplet; and, if you don't, why would you pay Linode's premium fee?
SCdF 3 days ago 1 reply      
> Linodes are now SSD. This is not a hybrid solution - it's fully native SSD servers using battery-backed hardware RAID. No spinning rust! And, no consumer SSDs either - we're using only reliable, insanely fast, datacenter-grade SSDs that won't slow down over time. These suckers are not cheap.


Not to slam what Linode is doing here, and I'm sure there are probably lots of great reasons to buy datacentre-grade SSDs, but just thought I'd point out that slowing down over time (or data integrity issues) are not really consumer-grade problems any more :-)

ihowlatthemoon 3 days ago 1 reply      
VPSBench result:



  CPU model:  Intel(R) Xeon(R) CPU           L5520  @ 2.27GHz
  Number of cores: 8
  CPU frequency:  2266.788 MHz
  Total amount of RAM: 988 MB
  Total amount of swap: 255 MB
  System uptime:   8 days, 12:03
  I/O speed:  69.9 MB/s
  Bzip 25MB: 8.96s
  Download 100MB file: 47.2MB/s


  CPU model:  Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz
  Number of cores: 2
  CPU frequency:  2800.086 MHz
  Total amount of RAM: 1993 MB
  Total amount of swap: 255 MB
  System uptime:   2 min
  I/O speed:  638 MB/s
  Bzip 25MB: 5.10s
  Download 100MB file: 146MB/s
Test: https://github.com/mgutz/vpsbench

harrystone 3 days ago 0 replies      
I would love to see them still keep all those old disks and sell me some huge, cheap, and slow storage on them.
orthecreedence 3 days ago 3 replies      
Bummer - they're taking 8 cores away from the cheap plans and replacing them with 2. Does anyone know if the new processors will offset this difference? I don't know the specs of the processors.

Linode's announcements usually come in triples... I'm excited for number three. Let's hope it's some kind of cheap storage service.

vidyesh 3 days ago 1 reply      
So this makes Linode practically on par with DO's $20 plan. Until now, the $20 plan at DO was better; now it's just a choice of brand.

But here is one thing that DO provides and I think Linode should too: you can spin up a $5 instance anytime in your account for any small project or test instance, which you cannot do on Linode.

davexunit 4 days ago 4 replies      
Cool news, but their website now has the same lame design as DigitalOcean. I liked the old site layout better.
jrockway 3 days ago 1 reply      
A nice reward for those of us who have been using Linode from before they even had x86_64 images.
mwexler 4 days ago 1 reply      
There's similar and then there's alike. I guess it makes comparison easy, but imitation certainly must be the sincerest form of flattery:

Compare the look and feel of https://www.linode.com/pricing/ and https://www.digitalocean.com/pricing/

ksec 3 days ago 0 replies      
Sometimes I just wish the pricing would get better as you go larger.

What is the difference between the 16GB - 96GB plans and a dedicated server? And why would I pay 3x the price? The advantage for hosting companies that offer both cloud/VPS and dedicated servers is that they can mix and match depending on usage. If you are actually building any sort of infrastructure with Linode, those large boxes are extremely expensive.

__xtrimsky 3 days ago 0 replies      
Could someone please explain what improvements we can get from SSDs for web applications?

I know they read files faster, but in most cases reading a couple of PHP files is not such a big improvement.

My guess would be databases? Read time improvements for MySQL?

rdl 3 days ago 0 replies      
Semi-related: does anyone know of any good (but still fairly cheap) providers doing Atom C2750/C2758 servers yet?
extesy 4 days ago 2 replies      
So now they match DigitalOcean's prices but offer slightly more SSD space in each plan. I wonder what DO's answer to this will be. They haven't changed their pricing in quite a while.
jevinskie 4 days ago 0 replies      
I resized a 1024 instance to 2048 last night and it looks like it is already running on the new processors (from /proc/cpuinfo):

  model name: Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz

Should I upgrade? Do I want 2 x RAM for 1/2 vCPUs? =)
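A quick sanity check after a resize, using generic Linux commands (nothing Linode-specific), shows which CPU model and how many vCPUs the hypervisor exposes to the guest:

```shell
# Print the CPU model string the guest sees, then count visible vCPUs.
grep -m1 'model name' /proc/cpuinfo
vcpus=$(nproc)
echo "visible vCPUs: $vcpus"
```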

ausjke 3 days ago 0 replies      
This is great indeed. I'm happy Linode did this. I ran each command below 10 times and used the average:

dd bs=1M count=1024 if=/dev/zero of=test conv=fdatasync

Linode: 1073741824 bytes (1.1 GB) copied, 1.09063 s, 985 MB/s
D.O.: 1073741824 bytes (1.1 GB) copied, 3.23998 s, 331 MB/s

dd if=/dev/zero of=test bs=512 count=1500 oflag=dsync

Linode: 768000 bytes (768 kB) copied, 0.478633 s, 1.6 MB/s
D.O.: 768000 bytes (768 kB) copied, 1.01716 s, 755 kB/s
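The "ran it 10 times and averaged" step above can be scripted. This is just a sketch, not the commenter's actual procedure: it assumes GNU dd (for conv=fdatasync and the throughput summary printed on stderr), uses a smaller 64 MB file so it finishes quickly, and the file name and run count are arbitrary:

```shell
# Run the fdatasync write test several times and average the throughput numbers.
runs=5
tmp="ddtest.$$"
avg=$(for i in $(seq "$runs"); do
  dd bs=1M count=64 if=/dev/zero of="$tmp" conv=fdatasync 2>&1 |
    awk -F, 'END { print $NF }' |  # last comma field of the summary, e.g. " 331 MB/s"
    awk '{ print $1 }'             # keep just the number
done | awk '{ sum += $1 } END { printf "%.1f", sum / NR }')
rm -f "$tmp"
echo "average write speed: $avg MB/s (assuming dd reported MB/s on every run)"
```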

filmgirlcw 3 days ago 1 reply      
Shall we call this the DigitalOcean effect?
level09 3 days ago 0 replies      
I would probably move back from DigitalOcean if they offered a $10/mo plan.

I know that's not a big price difference, but some websites really don't need a lot of resources. They work well on D.O.'s $5 server, and I really have a lot of them.

bfrog 4 days ago 3 replies      
I'm actually a little unhappy, it looks like they reduced the CPU count for my $20/mo instance. At this point there's basically no reason to stay with them now.
corford 3 days ago 1 reply      
Big shame the new $20 plan now only offers 2 cores versus 8 with the current plan. For my workloads, I don't need 2GB RAM or SSD disks, I just need the cores :(
h4pless 4 days ago 2 replies      
I noticed that Linode talks a good bit about their bandwidth and includes outbound bandwidth in their pricing model, which DO does not. I wonder if DO has a similar model or if transfer capacity is the only thing you have control over.
funkyy 3 days ago 0 replies      
I would love to see Linode offer a large-HDD storage option as well. I am dying to find a really inexpensive cloud provider with cheap disk space (SATA is fine), reasonable bandwidth, low CPU and RAM, and Linode-style support/caring. Give me a server with a ~500 GB hard drive, 2 TB outgoing transfer, 1 core and 1 GB RAM for ~$20-30 and I am all yours.
jaequery 4 days ago 0 replies      
I'm really impressed by their new CPU specs. From experience those aren't cheap, and it's possibly the fastest CPU on the market. Combined with the SSDs, Linode may currently be the fastest of any cloud host.
vbtechguy 3 days ago 0 replies      
Updated benchmark results for the 2GB vs 4GB vs 8GB vs 16GB plans from Linode vs DigitalOcean: https://blog.centminmod.com/346. Linode definitely has the faster CPUs and disk I/O as you move up in plans above 2GB. The 16GB plans are pretty close, though, if you look at the subtests in UnixBench and ignore those affected by the different base Linux kernel versions used.
Justen 4 days ago 1 reply      
Higher specs sound really nice, but on HN I see people commenting on the ease of DO's admin tools. How does Linode's compare?
kijin 3 days ago 0 replies      
About a week ago, I wrote a comment in another Linode-related thread asking how the new usage patterns that hourly billing encourages might affect CPU contention. At the time, I received 11 upvotes but no replies. Apparently, quite a few people were interested in my question but had no useful conjectures to share.


Now it's obvious what Linode's answer to that question is: Lower "burstable" CPU for lower plans.

The $20 plan used to be able to burst to 8 cores for short periods, but now it only has access to 2 vcores. The "guaranteed" processing power is probably higher with the newer CPUs, but at the expense of short-term burst performance.

Another minor detail that I find interesting is that the transfer cap for the $20 plan has been increased to 3TB, whereas the $40 plan still gets 4TB. Apart from the transfer cap plateau-ing at the extreme high end, this is the first time that Linode has broken its 11-year-old policy of "pay X times as much money, get X times as much RAM/disk/transfer".

jaequery 4 days ago 1 reply      
DO's biggest problem is their lack of "zero-downtime snapshot backup and upgrading". I've not used Linode - does anyone know if theirs is any different?
shiloa 3 days ago 1 reply      
I have mixed feelings about this. We're in the process of moving from Linode to Rackspace but haven't flicked the switch just yet - was planning to this weekend.

Our Linode server (16 GB plan) has been performing terribly lately wrt I/O (compared to, say, a MacBook Pro running the same computations), and we decided we'd had enough. I guess we'll have to compare the two after the upgrade and decide.

jebblue 3 days ago 0 replies      
I was looking into alternatives but now I'll stick with them, I can't find another cloud provider whose stuff works so well.

edit: I just finished the migration, my disk speed test is through the roof, free ram is phenomenal!

Kudos 3 days ago 0 replies      
Ubuntu 14.04 LTS is now available on Linode too.
beedogs 3 days ago 0 replies      
This is nice to see. SSD has gotten ridiculously cheap lately.
jdprgm 3 days ago 0 replies      
This is really a fantastic upgrade. I've been hosting with Linode for a few months now and been very happy with them. I run a relatively transfer intense SaaS app and a 50% transfer increase makes quite an improvement.
ForFreedom 2 days ago 0 replies      
The specs for $20 are the same for DO and Linode, except for the extra 8GB of disk on Linode.
ff_ 3 days ago 0 replies      
Wow, that's beautiful. Currently I'm a DO customer ($10 plan), and if Linode had a $10 plan I'd make the switch instantly.
dharma1 3 days ago 0 replies      
ohhh yesss. DO is good for some locations like Southeast Asia, but I'm loving this upgrade for my London and Tokyo Linodes.
icantthinkofone 3 days ago 0 replies      
Without FreeBSD support, it means nothing to me.
hyptos 3 days ago 1 reply      
Wow - an EC2 free-tier instance:

  $ dd bs=1M count=1024 if=/dev/zero of=test conv=fdatasync
  1024+0 records in
  1024+0 records out
  1073741824 bytes (1.1 GB) copied, 35.8268 s, 30.0 MB/s

mark_lee 3 days ago 0 replies      
Awesome. If you're a small or medium-sized company, no options other than Linode or DO should be considered at all, not even AWS or Google Cloud.
EGreg 3 days ago 0 replies      
I love Linode. I switched from Slicehost for its 32-bitness back in the day, and stayed for the awesome culture and independence. Slicehost got sold to Rackspace.

However, I am seriously considering a move to Amazon Web Services for one main reason: I need to decouple hard drive space from RAM. Hard drive space is so expensive on Linodes!!

notastartup 3 days ago 1 reply      
These upgrades are impressive but they are a bit too late to the game. DO still has these advantages besides the cheap monthly price:

- DO has an excellent and easy-to-understand API
- Step-by-step guides on setting up and running anything
- Minimal and simple

To entice me, it's no longer just a matter of price; DO has extra value added, largely due to their simplicity.

kolev 3 days ago 1 reply      
Goodbye, Digital Ocean!
zak_mc_kracken 4 days ago 1 reply      
Do either Linode or DigitalOcean offer plans without any SSD? I couldn't find any.

I just want to install some personal projects there, for which even SSDs are overkill...

izietto 4 days ago 0 replies      
Do you know of cheaper alternatives? Like DigitalOcean, as @catinsocks suggests.
New axe design uses lever action to make splitting wood easier vipukirves.fi
532 points by sinned  3 days ago   154 comments top 48
jeffbarr 3 days ago 4 replies      
I bought a Vipukirves axe early this year. I have a wood-fired pizza oven in my backyard and use small, nicely split pieces of wood in order to retain some control over the fire.

Before buying the axe I used a four-sided wedge (basically an elongated pyramid) and a sledgehammer for splitting.

So, how does this axe perform? Overall, I am very happy with it and proudly show it off at every opportunity. After spending some time learning how to use it, I can report that, for some types and conditions of wood and with the right grip on the handle, it truly does split wood in the manner shown in the video.

As noted in the other comments, certain types of wood are easier to split than others. After my pizza oven was finished, I somewhat foolishly bought a 1/2 cord of apple wood from the apple-growing region of Washington state. This wood is incredibly dense and has proven difficult to split by any means, even after 3 years of seasoning. The splitting issue is made worse by the overall knottiness of the wood.

I also bought a 1/2 cord of mixed wood from a local supplier. The axe is at its best on straight, dry, knot-free pine, oak, cedar, and so forth. The vertical motion is translated into horizontal motion milliseconds after the blade of the axe penetrates the wood and the split-off portion flies to the side with explosive force, often landing 8 to 10 feet away.

It took me an hour or two to learn to use the axe properly, with a relaxed grip to allow the head to rotate after it strikes the wood. Wearing gloves (recommended in any case) can make this even easier.

The blade of the axe is not razor sharp and does not require sharpening or other maintenance.

linhat 3 days ago 3 replies      
This axe looks really awesome, physics for the win.

Also, instead of using an old rubber tire, I highly recommend building a variable length, tensioning chain, much like this one: http://www.youtube.com/watch?v=wrLiSMQGHvY
Makes chopping wood so much more fun.

And then, there is also the stikkan: http://www.stikkan.com/
Perfect to hang up next to your fireplace to do some more fine grained wood chopping, cutting larger pieces into smaller ones.

ghshephard 3 days ago 5 replies      
That has to be some of the most seasoned, knot-free wood ever split. I wonder how many logs he had to go through to find stuff that split that well - 20 logs for every one that did?

A sledgehammer and splitting maul get the job done 95% of the time, the "eccentric axe" the other 5%.

Well - maybe 85% splitting maul, 12% eccentric axe, and 3% splitting wedge (which typically has a torsion in it to create a turning effect to split the wood).

I would love to hear of an independent comparison of the Eccentric Axe versus a Splitting maul.

The tire is a really great idea though.

binarymax 3 days ago 1 reply      
I, as well as others, have a history of strained wrists when splitting wood with a traditional axe. Clicking through to their website they recommend a loose grip when the head is about to strike, allowing for the rotation to take place. This makes sense because before you'd need a strong grip to hold onto your axe to make sure it doesn't slip out of your hands when giving a swing strong enough to split. Since you are swinging much more gently, this may actually work! At the very least I am glad that I don't need to split wood these days otherwise I'd give this a try in a second.
fit2rule 3 days ago 0 replies      
I thought it was already common knowledge among the axe-wielding cognoscenti that the way you chop wood without tiring yourself out completely is to add a little 'twist' to your down-swing, just as the blade makes contact, which has the same leverage effect - albeit with a 'normal' blade.

I dunno, I guess I just learned that little twist from my uncle and grandfather, and never really thought it was so magical. I'm not sure how I feel about the safety of a mighty sharp wood-cutting blade being off-balance on the downswing. Sure, the guy in the video has a fairly safe setup, but if you don't have the luxury of a safety rig (i.e. you're a consumer who just bought one of these Wonder Axes), the potential for misdirection and glancing blows redirecting the axe towards the user seems pretty high..

ggchappell 2 days ago 1 reply      
This is an interesting idea. It's amazing to think that a tool like the axe -- which has been around for millennia -- can still be improved.

[OTOH, I've split a fair amount of wood. And I'd have to say that anyone who thinks splitting wood with an axe is a good way to produce fuel for the primary heat source for a house, is pretty much insane. It's exhausting work. Get a powered splitter for goodness sake. :-) ]

wcfields 2 days ago 0 replies      
And for what is probably the least safe way to split wood, http://www.thestickler.com/

It attaches to your car-axle and is a giant screw that splits wood.

easy_rider 3 days ago 2 replies      
This is really cool. I've never split more than a couple dozen logs in my life, but Ray Mears taught me to wedge the blade to the side on impact, which seems to emulate these physics. I'm not a big guy, and it's a pretty hard motion to get into when you're swinging down as hard as possible.

This seems like a very capable survival/bushcrafting tool for less accomplished wood cutters.

tl;dr I wood buy.

barkingcat 3 days ago 1 reply      
ISL 2 days ago 1 reply      
This axe isn't the only one that can deliver such a video. Good wood and a proper splitting axe allow similar results.

The tire trick, or a bungee cord, can allow big gains in speed.


lafar6502 3 days ago 0 replies      
For me the greatest part of this innovation was putting the log into a car tire.
josefresco 3 days ago 0 replies      
A lot of folks here are claiming that the type of wood being split makes a huge difference and it's true, however...

To address this the video should show him using both a conventional axe, and his version on the same wood. Apologies if he does this later in the video as I didn't watch the entire thing.

bprater 3 days ago 4 replies      
Knotted wood is much more challenging to split.

Even with a hydraulic splitter, chunks of wood with lots of branches can stall a machine pressing with tons of force. Great demo, but unrealistic unless you only chop beautiful limbless tall oaks.

3rd3 2 days ago 0 replies      
That's a typical tool one might find on Cool Tools: http://kk.org/cooltools/

They recently published a book which is a lot of fun to browse through on a lazy afternoon.

baq 3 days ago 2 replies      
If you have ever tried to use an axe, the video in the article will look like sorcery.
dynofuz 12 hours ago 0 replies      
Looks like this is more primitive knowledge being rediscovered. Note how caveman axes were stones tied to the side of a stick: http://www.cbsnews.com/news/evidence-uncovered-of-worlds-old...
hrkristian 3 days ago 2 replies      
I've spent nearly every winter growing up swinging axes, and I cringed a bit when I saw him strike branches.

For the most part the axe does a wonderful job against anything, and that guy has ridiculously good aim, but anyone who has at some point been bad at chopping wood probably knows those twists to the side can do a real number on your wrists and hands. It seems to happen quite a bit.

It's still an amazing innovation, and I hope to be able to pick one up as a gift. The article is sadly not very informative.

yukichan 2 days ago 5 replies      
This may be an unpopular reply, but splitting firewood sounds like you're going to be making an open fire, and open fires are a danger to your health:


> "The unhappy truth about burning wood has been scientifically established to a moral certainty: That nice, cozy fire in your fireplace is bad for you. It is bad for your children. It is bad for your neighbors and their children."


> "Smoke is made up of a complex mixture of gases and particles produced when wood and other organic matter burn. A major health threat from smoke comes from fine particles (also called particle pollution, particulate matter, or PM). These microscopic particles can get into your eyes and respiratory system, where they can cause health problems such as burning eyes, runny nose, and illnesses such as bronchitis. In addition to particle pollution, wood smoke contains several toxic harmful air pollutants including: benzene, formaldehyde, acrolein and methane."

iamthepieman 2 days ago 0 replies      
For splitting wood from a fallen tree, or cleaning up around the property, a regular axe and splitting maul are fine. When I've had to split enough wood to burn for the winter, I always just rent a hitch-mounted hydraulic splitter: 60 bucks and I can do 5 cords in a day.

This tool could possibly take the place of a splitting maul, but a great advantage of the maul is that it can be used to split knotty wood that doesn't have straight grain. It takes a bunch of swings and force, but it does get the job done. With the wide head of the Vipukirves, it wouldn't be able to get down to the center of a half-split log.

jotm 3 days ago 0 replies      
This would hurt sooo many newbies... You're much better off with a normal axe that won't try to twist and jump out of your hand every single time.

For wood cutting, the size of the axe head matters a lot - too small and it doesn't have enough force, too large and it gets stuck very easily.

The length of the handle is also important - you'll fare much better with longer ones, but the longer it is, the harder it is to aim and control.

As with anything, practice is key, but I'm pretty sure you don't want to start with this axe.

orky56 1 day ago 0 replies      
The key here is that the axe splits the wood rather than forcefully gets into the wood. Not sure how this is different? Try turning the wood on its side and see how the axe works then. By striking it parallel to the wood structure, it separates the wood from the point of impact. If one were to strike it orthogonal to the wood structure, this would be pretty pointless.
kofejnik 16 hours ago 0 replies      
You can get the same effect using an angled strike with a regular axe, here's some Russian dude demoing it with a big fresh pine log: https://www.youtube.com/watch?feature=player_detailpage&v=Os...
marktangotango 3 days ago 0 replies      
I'm amazed how many of you have used an axe. Nothing to add, you all have said it all :)
johngalt 2 days ago 0 replies      
Interesting cross between an axe and a froe. Gives you the levering action of a froe in one motion.
micro_cam 2 days ago 0 replies      
Neat idea, and I'd love to try it, but I'd need to see it side by side with a good maul, which makes use of physics in a few ways.

Especially with dry lodgepole pine, I can do about as well as the guy in the video with my six-pound maul, and I'm out of practice, though I split a lot of wood as a kid.

On a related note, I wonder if anyone on Hacker News has a favorite wood stove... I'm really intrigued by the new efficient, low-emission ones, but they are costly and in-depth reviews are sparse.

Zigurd 2 days ago 0 replies      
This makes me want to instrument my wood splitting. I suspect that I will find that improving the fast splitting is less important than avoiding or improving the outcomes for the toughest pieces to split.

A bit like finding that a faster cache that doesn't reduce cache misses or the cost of cache misses can disappoint when measuring overall performance.

GalacticDomin8r 2 days ago 0 replies      
It's pretty easy to know who really knows how to split wood with any kind of axe: you start on the outside, working your way in, splitting off sections that look like they present the best chance of avoiding a knot. Anything other than that is snake oil and bluster.

And no, I'm not buying one of these axes; a traditional splitting axe works very well. I don't see how the featured axe could offer anything beyond the traditional one, since you aren't constrained to a plumb blow as the marketing would have you believe.

sbierwagen 2 days ago 1 reply      

  Leveraxe is faster than a hydraulic splitter.
Faster than a small one, maybe. The larger ones are pretty damn quick, as you'd want them to be for the price: https://www.youtube.com/watch?v=knZkc_vzGUE

emiliobumachar 3 days ago 0 replies      
This is one of those inventions, like the hot-air balloon, for which we had all the prerequisite technology for ages.
3pt14159 2 days ago 0 replies      
I needed to break up a bunch of concrete back in the day, so I bought a medium duty jackhammer. Later, I discovered the joy of easily splitting wood.
johnobrien1010 3 days ago 0 replies      
From the website, it looks like it costs $266. I think you'd have to be chopping a lot of wood for it to be worth getting that expensive an axe, even if it is better.
paul_f 2 days ago 0 replies      
You know what really annoys me? People are now so used to gas starters or starter logs that they don't split their wood at all. They just pile up the whole pieces and try to start a fire, then fight it for three hours and end up with smoke and a bunch of half-burnt wood.
joryhatton 2 days ago 1 reply      
First question of the FAQ...

"Question 1: Can VIPUKIRVES be used by a woman?"

mowfask 1 day ago 0 replies      
25cm birch logs surely are fun to split. With any kind of axe.

We used to chop 1m (~3 feet) logs of much more compact wood (oak, beech), and this looks like a toy to me. Has anyone seen it used on more serious logs?

dirktheman 3 days ago 0 replies      
An accurate axeman, but with that kind of neat log I guess you could get the same results with a regular axe and good technique.

It's very hard to improve something that has been around for the past 35,000 years, I guess. Maybe they're better off working on that car tire so that it can accommodate logs of different diameters.

VaedaStrike 2 days ago 0 replies      
Is it just me or does it really look like an incisor?

I wonder if similar physical principles are at work in making their appearance similar.

KhalilK 3 days ago 0 replies      
I am most impressed by the use of the tire to keep the wood upright and positioned.
vanderZwan 2 days ago 3 replies      
As a left-handed person: great, another lethal tool optimised for right-handed people and therefore more unwieldy and dangerous for us.
Theodores 3 days ago 1 reply      
This famous quotation kind of needs an update:

"Give me six hours to chop down a tree and I will spend the first four sharpening the axe." - Some Famous American Bloke

jhallenworld 2 days ago 0 replies      
Off topic perhaps, but I switched from oil/steam to gas forced hot air last year. Between the 96%-efficient furnace, the new gas hot water heater (compared with the 75% oil/steam with a super-inefficient continuous hot water attachment) and the fracking boom, I can report that gas is much, much cheaper: for me, $3500 -> $700/year for a 1100 sq ft apartment near Boston.


A heat pump with ground heat-sink may be a better long term option, but it's much more expensive. Also I'm pretty sure the gas is more efficient when considering the power plant and transmission line losses.

Even better would be a home gas co-generation plant. Honda has such a thing, but again not cheap:


zomg 3 days ago 0 replies      
There's nothing new or innovative here; people have been doing this for years. It really depends on the KIND of wood you're chopping. This guy has it down to an art: https://www.youtube.com/watch?v=2vThcK-idm0
praeivis 1 day ago 0 replies      
After a while your hands will hurt. For a few choppings per day it's maybe not bad, but that price...
anactofgod 1 day ago 0 replies      
Archimedes would approve.
arca_vorago 3 days ago 1 reply      
This looks dangerous to me. I grew up in the mountains, and consider them my home. (The kind of place where the nearest Walmart is an hour-and-a-half drive away, and everyone shares their elk kills, and logs and chops their own wood.)

One of the most dangerous things an axe can do is go sideways on you on the hit; that sideways rebound, or the follow-through if the log falls away, can mean an axe in the foot or leg for someone who has been chopping for a few hours and is tired.

Also, proper drying of logs before splitting can't always be done, and that log looks very dry and doesn't look knotty.

Honestly, when I chop wood I keep three tools near: the normal sliceaxe (a skinny, thin and fast one for easy stuff), a larger wedge with splitter arms (http://www.thehulltruth.com/attachments/dockside-chat/282042...), and a normal handleless wedge and hammer for the really hard stuff.

Just thought I'd share a little info from a guy who spent many hours of his youth chopping logs. If I was getting paid, I did sometimes cheat and use a gas-powered splitter... if you have two people working it, the throughput can be much higher than two people splitting normally.

adambware 2 days ago 0 replies      
Such beauty in the simplicity of this concept!
viggity 3 days ago 0 replies      
Cool axe. FWIW, an easier/more efficient way of keeping the wood together while splitting it is a bungee cord. Using a tire only works if you have relatively big logs.
tzury 3 days ago 1 reply      
This is the page you need to read!


UweSchmidt 3 days ago 3 replies      
Alternative title:

"New axe design uses lever action to make splitting wood a lot easier"

"Uses physics" sounds like "Stand back: I'm going to try science" to me.

Stock Photos That Dont Suck medium.com
528 points by Redsprows  1 day ago   64 comments top 29
salimmadjd 1 day ago 1 reply      
I have few issues with this post:

1 - The quality of images are not that good. Some are okay, but overall not that good.

2 - Many microstock sites have a large volume of decent-looking images, and you can always find something decent.

3 - You end up spending more time on these sites trying to find a decent image than the money you save is worth.

4 - Quality photography is an art and a craft. Because of the higher supply, prices have come down, but photographers still need to make money.

5 - Model release? Just looking at one of the sites [1] one of the images had a photo of a skater [2]. I'm not sure if the skater had given a model release or not. But at least with more reputable stock sites you know there is always a model release required.

[1] http://unsplash.com/

[2] https://s3.amazonaws.com/ooomf-com-files/7erBZvZMQ2mmuFQ10vc...

edit: formatting

instakill 1 day ago 1 reply      
Unfortunately, free stock photos do suck.

I've known about unsplash and most of the other resources for a while. This weekend I needed to find some images for a landing page I was creating.

The photos you find on websites like Death to the Stock Photo are better quality than most pictures any of us will take. They also have that hipster feel that a lot of start-ups go for with their visual communication. The problem, as semerda has already mentioned, is that you have to wait long stretches at a time to get a small collection of photos. Unless you are extremely lucky, none of those photos will match what you require.

The other problem is that, ostensibly, photographs with people and faces convert far better than images of landscapes or abstract imagery, which is what most of these websites provide. If they do happen to have people, then the model release is in question (again, already mentioned in this thread).

Searching for good photographs is really hard, especially if you don't want to relax your visual requirements. I ended up spending 4 hours searching for a good image on Fotolia, a professional paid-for service. If I stuck to free services, I'd still be searching.

What DOES suck about stock photography is understanding the licensing on paid-for sites. A lot of it seems riddled with ambiguity and legalese. For instance, I downloaded the photos I paid for yesterday but wasn't asked to provide a domain. Should I be expecting a copyright infringement notice when their bots scan my website? I'll then have to prove I bought the license to use that image. A bit of an inconvenience, and sometimes a scary affair.

Another thing that sucks is the ubiquity of white people in stock images. Being in South Africa, and trying to serve the South African market as my primary market, I wanted to find great photos of black people. I'll tell you, it's not easy. Also, comparing about 100 photos, I can anecdotally say that images of black people usually cost fewer credits than those of white people.

What I do think they are great for is imagery to use for blog posts. I often use them [1] for blog posts that I write. Just shell out some money for paid photography if you're serious about using photos properly.

[1] http://blog.mybema.com/2014/04/10/improving-Mybema-test-suit...

emw 1 day ago 1 reply      
Wikimedia Commons has a large set of curated, high-quality, free photographs: https://commons.wikimedia.org/wiki/Commons:Featured_pictures, https://commons.wikimedia.org/wiki/Commons:Quality_images.

There are also many, many gems among the 20+ million other free images at https://commons.wikimedia.org.

The site has an extensive category system that complements keyword-based search, e.g. https://commons.wikimedia.org/wiki/Category:Fruit.

semerda 1 day ago 0 replies      
Looking at some of the links on this site to supposed "stock photos that don't suck"... they are just awful in quality. Many are underexposed, poorly composed, and out of focus. Also, some of those sites ask for an email address so they can spam me with a few pics they select every week. I just want to find the pic I need today, not have someone else's picks sent to me. Just awful.

Having been an iStockphoto photographer/contributor since 2005, I can tell you that the bar to be an iStockphoto or Getty photographer/contributor is high. You need to pass a photography test, and every photo uploaded is scrutinized by pros so it doesn't suck: its exposure, framing, quality, etc. have to be top notch. That guarantees quality, at a price.

If you want free photos, you have a better chance of finding OK photos on Flickr. When you find one you like, contact the photographer and ask if you can use their photo, giving them credit on your site. Just be careful with faces in a photo. Sites like iStockphoto force all photographers to sign a model release form, which means no surprises for the people appearing in a photo.

sejje 1 day ago 1 reply      
I built a search engine for flickr CC images. Most of them require attribution (the license is up to the photographer).

Flickr's search is not perfect (it favors "interestingness" over relevance), but after a page or two the results are typically passable.

It provides some nice default attribution HTML. Basically I was just scratching my own itch. YMMV.


MichaelApproved 1 day ago 1 reply      
I've always wondered what copyright law says about these sites.

These sites get submissions from artists who are granting them a license to distribute and offer these images royalty free. Great.

But what if an artist uploads an image to which they don't actually own the copyright? I presume these sites are protected under copyright law because they're not liable for content distributed on their networks, but what happens to the person who used the image? I understand that they wouldn't be protected and would still be liable.

glenda 1 day ago 2 replies      
I wouldn't say that a photo like [0] sucks any less than something like [1] at all. In fact the images listed on http://littlevisuals.co/ are just plain bad imho.

The only difference I see is that the images listed in this article come with 'filters' already applied. I think for most use cases this is actually a negative; I would never consider using an image like that for something serious - it comes off as really cheesy, even more so than the Shutterstock photos. At least with those I am in control of the post-processing.

[0] http://designrope.com/wp-content/uploads/2014/04/stock-photo...

[1] http://image.shutterstock.com/display_pic_with_logo/637321/1...

gk1 1 day ago 3 replies      
A surprisingly good list. I expected the usual CC galleries on Flickr or elsewhere, or Getty Images' new (and free) embedding option.

As mentioned by someone else, I don't get the "photos sent to you weekly/monthly" sites. Why? I can't picture (har har) someone needing a photo for some design, but thinking "Welp, nothing good on the web... I'll wait to see what comes along next week." Unless it's just aimed at photography enthusiasts who enjoy seeing beautiful photos, with no intent to use them for something.

I'll also add that if you're looking to add photos to your content, consider being different and using a funny cartoon drawing instead, like Jason Cohen, 42Floors, and others have been doing. Here's a stock cartoon site I created just for that purpose: https://www.gagcartoons.com

baby 1 day ago 0 replies      
The thing is, I don't want to receive stock photos daily/monthly. I want to be able to search through stock photos to find ones RELEVANT to my needs, when I need them.

I mostly use flickr and check creative commons and all the right radioboxes in the search. But yeah most of the time they suck.

bjelkeman-again 1 day ago 0 replies      
One of the things we decided when we started up was to not use any stock photography. I think we have mostly managed. Instead we focused on taking a lot of pictures. It feels to us, and we hear this a lot from others, that it gives a genuine voice to our comms work, which is really important for us. The photography is spread over a number of Flickr accounts, but we know where it is.

My colleague Mark has a bunch of it: https://www.flickr.com/photos/charmermrk/collections/7215760...

_august 1 day ago 1 reply      
Ha, I had just put up my launch page [0] yesterday using the blue mountain image. It was kind of weird to see it on top.

This article seems directly copied from this medium post [1] I found a while back, except for the addition of paid stock photo sites.

[0] http://www.wanderdash.com/

[1] https://medium.com/great-reading-for-startup-founders/62ae4b...

arb99 1 day ago 0 replies      
sxc.hu, which recently changed to http://www.freeimages.com/, has a decentish collection. I've been using it for years when quality isn't too important.

But I tend to use photodune.com for anything a bit more serious; the images vary from $1 to $5, and they have many to choose from. If they don't have anything good enough, then it's over to iStockphoto.

of course there is also the getty embed thing they announced a while ago (http://www.gettyimages.com/Creative/Frontdoor/embed) which might be handy if you just want small blog content.

lubujackson 1 day ago 0 replies      
Alternatively, if you want to search for a more specific image, this is pretty convenient: http://nuggety.com/u/nuggety/top-image-search-engines
zosegal 1 day ago 0 replies      
A newly discovered favorite of mine is http://stocksy.com - As far as I know it's invite-only and/or has a pretty strict approval process for photographers who want to sell on their site. From what I've seen, the quality is often higher than your typical iStock. It's probably a little more expensive than iStock, but nowhere near the cost of Getty, where you can spend several hundred to thousands of dollars per photo. The biggest downsides I've noticed so far are that their search/refinement filters are not the best, and they don't have the quantity that other larger stock photo sites do.

Here's an example:




uptown 1 day ago 0 replies      
Not free, but Dollar Photo Club (www.dollarphotoclub.com) seems to be one of the more cost-effective offers I've come across in recent months. Certainly cost-effective for anyone requiring a steady stream of stock photos for blogging or some other purpose.
_pmf_ 1 day ago 0 replies      
"Ethnic people around whiteboard, discussing important business things" -- artist unknown
notdan 1 day ago 0 replies      
Not free, but I've found that http://photodune.net has good stock photos that aren't too expensive ($1-$5 per image and usually the $1 or $2 image is large enough for a website). They are much cheaper than istockphoto, and they are priced in dollars, not credits, and they allow you to buy a single photo without subscribing to a monthly plan, buying lots of credits in bulk, etc.
Turing_Machine 1 day ago 0 replies      
Cool. Several were new to me. Here's another good one: http://www.morguefile.com/
aaron987 1 day ago 0 replies      
The photos from deathtothestockphoto look decent, but I see no central place to search through them. You can only get them by email. Their Instagram feed only has 46 photos. The rest of these look like the usual generic photos.

However, it is refreshing to finally find some sites that actually offer free photos. Most sites that pop up in a Google search have a few free ones, then billions of affiliate links to Shutterstock. Nothing against Shutterstock, but it does get frustrating to search for free images and constantly end up on a page requesting payment.

Theodores 1 day ago 5 replies      
I watch TV and I believe that the ethnic mix of people on TV (in the UK) is fairly representative of the mix you would find on the train going in and out of London.

I look at marketing material prepared with stock images and fashion catalogues. What do I see? Lots of white faces. Maybe there is a picture in there somewhere of some black girl playing with young Tarquin (who is obviously white). The word 'tokenism' springs to mind.

I don't believe most people care what colour people are; however, if you use a lot of stock photography, step back, look at the body of your work, and wonder why all the faces are white. It happens. Sometimes all the images that are supplied are of white people, yet you may not notice this until, for some random reason, a person of colour shows up. Then the '98% white' trend is revealed.

The thing is that not everyone wants to be white. People might want to be young, thin, beautiful, with a full head of well groomed hair - that is aspirational. So you can keep the old, fat and ugly people out of the marketing material - that is fine. But to consciously or otherwise end up with only white faces is where TV was more than a generation ago. We should have moved on, perhaps even more so than TV as the internets is global rather than parochial.

will_lam 1 day ago 1 reply      
I'm surprised 500px wasn't on this list.
klunger 1 day ago 1 reply      
Not free, but high quality and much cheaper than shutterstock etc: http://yaymicro.com/view.action
RighteousFervor 1 day ago 1 reply      
This post has been getting a lot of mileage on both HN and layervault. Perhaps disproportionately.
kunle 15 hours ago 0 replies      
pcurl 1 day ago 0 replies      
Here is the original article that I assume this Medium post was based on, seeing as they essentially share the same title: http://designrope.com/design/find-stock-photos-dont-suck/
spenuke 1 day ago 1 reply      
Anyone know of a clip-art (non-photo illustration) version of this list?
moron4hire 1 day ago 1 reply      
I think the key thing is to not underestimate what you can use for imagery. It's just not necessary to have people standing around a conference room table with a Cisco phone in the middle that they're all leaning towards (except Carl in the back, dammit Carl) when talking about business services.

From all of the things I've ever read or written, nobody cared what the pictures were, unless the pictures were explicitly diagrams or illustrations. Otherwise, it was almost white noise.

I say "almost" because, apparently, they still want a picture, any picture, in the slot. I don't get it: what is the point of having a picture if the reader never shows a discernible reaction to its content? But you need to have a picture there. It could be a picture of a pie. It could be a picture of your city's skyline. It could even be a cat (actually, it probably should be a cat; you will probably get more hits). But A) it needs to be there, and B) it doesn't matter what it is.

For that reason, I have a strict "don't purchase stock photography" rule. I have in the past, and it just wasn't worth the money. I can throw together what I need in so little time that it's not worth paying for. And I'm not a graphic designer; this is just how unimportant the content of the image is.

pearjuice 1 day ago 0 replies      
Stock photos suck by definition. They give an unrealistic and overly edited/positioned view of reality. If you are using stock photos, you are in line with infomercials and false claims.
dirtyaura 1 day ago 1 reply      
Resembles an older post by Dustin Senos with almost the same list and title.


Gabriel García Márquez, Literary Pioneer, Dies at 87 nytimes.com
493 points by antr  3 days ago   160 comments top 32
simonsarris 3 days ago 8 replies      
Oh my. A paragon of magical realism and my second favorite author. Rest in peace.

Liking storytelling alone is sometimes not enough to like Marquez, you have to love language too. He uses (some might say abuses) language to impact his storytelling, often using incredibly long, convoluted sentences to weave his narrative. It can be hard to follow, sometimes intentionally, but I find it enormously satisfying to read and follow along with his brain. Like slowly drinking a maple syrup of words.

One of the best examples is the first 15 or so[1] pages of Autumn of the Patriarch[2], where the narrator winds this thread of what has happened slowly, using sentences that span pages, until you realize a shift from what has happened to a sort of what is about to happen. Then a fist slams on the table and the realization strikes you that the first part of the description was a kind of set up, this beautiful ruse. I wish I could be more descriptive but it would give away the delight. It's a great book about terror and despotism.

Marquez is not the kind of thing you can read in a noisy environment. At least I can't. I adore him so much. I could write a eulogy for days.

If you've never read him, please take a moment to read one of my favorite short stories, A Very Old Man With Enormous Wings


(I've hosted a copy of it (and many more short stories) for ages because most of the copies on the web are plagued with ads and miserable formatting)

If One Hundred Years of Solitude seems too long for you, I urge you to look into some of his very excellent shorter books, such as Autumn but also Of Love and Other Demons[3] and Love in the Time of Cholera.[4]

(Chronicle of a Death Foretold is even shorter, but I do not recommend it as the first Marquez book you read!)

[1] It could be the first 10 or 30 pages, it's been several years, but I am certain it's one of the better (and shorter) examples of his style.

[2] http://www.amazon.com/dp/0060882867

[3] http://www.amazon.com/dp/1400034922

[4] http://www.amazon.com/dp/0307389731

rjtavares 3 days ago 10 replies      
"Many years later, in front of the firing squad, Colonel Aureliano Buendía would remember that distant afternoon his father took him to see ice."

Best opening line of a book ever. RIP.

chimeracoder 3 days ago 0 replies      
I read six of García Márquez's stories in school - my favorite was "The Handsomest Drowned Man in the World"[0] ("El ahogado más hermoso del mundo"). If you're looking to get a taste of his writing but don't have time to read an entire book, this short story captures his style very well.

In a similar vein is "A Very Old Man with Enormous Wings"[1] ("Un señor muy viejo con unas alas enormes"), which was referenced in R.E.M.'s music video for "Losing My Religion"[2].

[0] https://hutchinson-page.wikispaces.com/file/view/The_Most_Ha...

[1] http://www.ndsu.edu/pubweb/~cinichol/CreativeWriting/323/Mar...

[2] https://www.youtube.com/watch?v=if-UzXIQ5vw

tdees40 3 days ago 1 reply      
My favorite Marquez story is that he never used adverbs ending in -mente, so he called his English language translator (Edith Grossman) and requested that she not use any adverbs ending in -ly.
russell 3 days ago 0 replies      
"One Hundred Years of Solitude" was the only book that everyone in my family ever read, me, my wife and my three kids.

"Mr. García Márquez, who received the Nobel Prize for Literature in 1982, wrote fiction rooted in a mythical Latin American landscape of his own creation, but his appeal was universal. His books were translated into dozens of languages. He was among a select roster of canonical writers, Dickens, Tolstoy and Hemingway among them, who were embraced both by critics and by a mass audience." from the article.

But the article doesn't begin to do the book justice. The mythology is Colombian, but it is all real to the reader. It is very worthwhile to read One Hundred Years along with a literary biography of Marquez. It was a wonderful experience for me. BTW, my taste is purely science fiction.

r4pha 3 days ago 0 replies      
I absolutely adore this man. I was lucky to be given a Portuguese translation of "One Hundred Years of Solitude" at the age of 16. I read it back then and loved the story itself and especially the beautiful writing style. About four or five years later I read it again in the original (even though I don't speak Spanish very well) and was even more amazed by the beauty of it and by how _my_ interpretation of it changed. I loved everything I have ever read from him, but I loved "One Hundred Years" so much I even feel ashamed of trying to use my own words to describe it.
maceo 3 days ago 1 reply      
Let's not forget that GGM was a life-long socialist and a supporter of the Cuban revolution.

He spent many years living in Cuba, and he considered Castro to be one of his best friends. He was a firm supporter of Chavez and looked forward to the day that Simon Bolivar's idea of a united Pan-America would be realized. Because of this, he was prohibited from entering the US during the Reagan administration.

As much as I love his works of fiction, my favorite book of his is the first volume of his autobiography, Living to Tell The Tale. I've been patiently waiting for news about volume 2 and 3 ever since the first one came out in 2002. I have never heard anything about these -- whether they were ever written remains a mystery. RIP to a magnificent man who brought so much pride to the people of our scarred continent.

paul_f 3 days ago 6 replies      
Can someone provide a quick summary of what it was that made Marquez so prominent? I hadn't known much about him at all.

FYI, if like me, you have trouble accessing the article, and using Chrome, right-click and open in an incognito window.

nfc 3 days ago 0 replies      
RIP. I felt a shock when I woke up and discovered this. He is probably the author that has most strongly influenced my literary tastes. Gabo wasn't just a great writer but was loved by many in Spanish-speaking countries. His mastery of prose and outstanding ability at his craft are laudable, but Gabo was a great person as well. A friend had the chance to meet him in the context of her PhD; I guess I'll forever envy her for that.

Being this my first HN comment (thanks Gabo for the strength) I'll go all out with a second unrelated part of the comment, less emotionally charged but perhaps more HN-like:

There are many comments in the thread about the translation of the first sentence of "One Hundred Years of Solitude". Translating is such a hard task; there is no way some of the meaning and subtleties will not be lost, since languages are not one-to-one. And even if we manage to carry over most of the meaning, keeping the flow is very hard except between very similar languages (Spanish/Portuguese). What is an impossible problem for computers is one for translators as well; we can only hope they give us a tasty human take on the task. I wonder if one day automatic translations of literary works will have a "style" option to simulate different translator sensibilities, or if we will settle for a winner-algorithm-takes-all approach to translation.

Something more I'd like to share; I'm very curious whether other people feel the same, because part of my homemade theory of language depends on this :). I'm lucky enough to read high-level literature in different languages; however, even if I can appreciate it, the pleasure I experience while reading Spanish is on a different level. Something similar happens while talking: I feel more strongly bound to Spanish in a very subtle way. It's not something I usually notice, just on some occasions. I only started learning languages after age 8. A possible reason is how the brain gets bound to words and language when very young; another is that I have lived more emotional experiences in that language. The second hypothesis would have to explain why I do not feel that way in French even though I've lived in Paris for 10 years. Obviously a one-case study; apply a grain of salt ;)

3am 3 days ago 2 replies      
Oh no... GGM was an underappreciated author in the non-Spanish-speaking world (in spite of wonderful, gifted translators... he was just an intrinsically difficult author to translate because of the poetic quality of his writing, IMHO). Cien años de soledad was one of the first non-trivial, non-English books I read. RIP.

edit: okay, removed 'really'.. I think he was underappreciated on a popular level, even though he was very well appreciated on a critical level.

noname123 3 days ago 2 replies      
Can someone tell me what is the theme of "One Hundred Years of Solitude" as applied to modern society? I read the book awhile ago and appreciated greatly the various character sketches.

Unfortunately, the literary criticism I sought out back then at a liberal arts college focused mostly on the metaphor of European colonialism in Latin America (the industrialization of the town with the rubber plant and the subsequent massacre of the residents after some kind of rubber-plant revolution, the consequences of military rule and violent overthrows as embodied by Colonel Buendia, the circular nature of history, the Spanish colonial past long felt after Latin America became independent).

Tbh, I'm not really interested in the whole multiculturalism and ethnic studies rehashing of the white-guilt trope. However, I find the obsessions of the various characters fascinating: the scientific obsession of the original patriarch that eventually descended into madness, Colonel Buendia making little gold fishes, the incestuous nature of the whole family, some ethereal nympho character who doesn't speak a word and then one day transcends to heaven, much to the horror of the venerable matriarch. What is your interpretation of the book?

jmadsen 3 days ago 0 replies      
Much of what I read of his was in Spanish, a second language for me, but even so I could see his command of language was incredible.

Things like, in "Relato de un náufrago", the storyteller has emotional ups and downs each chapter, and García Márquez carefully chose words that sounded emotionally up or down, giving a sense of rising and falling on waves through the whole story.

That, detected by someone whose Spanish was "solid" at best - what a joy it must be for a native to read.

KhalilK 3 days ago 1 reply      
His books were part of my adolescence, but "One Hundred Years of Solitude" was the essence of my formal education, I am sad he died but I am utterly glad he's lived.
jpdlla 3 days ago 0 replies      
My first favorite novel in Spanish was GGM's "Relato de un náufrago" (The Story of a Shipwrecked Sailor). Many don't know, but the full title is actually "Relato de un náufrago que estuvo diez días a la deriva en una balsa sin comer ni beber, que fue proclamado héroe de la patria, besado por las reinas de la belleza y hecho rico por la publicidad, y luego aborrecido por el gobierno y olvidado para siempre." (The Story of a Shipwrecked Sailor: Who Drifted on a Liferaft for Ten Days Without Food or Water, Was Proclaimed a National Hero, Kissed by Beauty Queens, Made Rich Through Publicity, and Then Spurned by the Government and Forgotten for All Time.)
noname123 3 days ago 8 replies      
OT but tangential: any magical realism authors to read? So far I've got Marquez, Jorge Luis Borges, and Murakami. And preferably the recommendation should provide philosophical consolation to a code-monkey worker bee in capitalist society.
kartikkumar 3 days ago 0 replies      
A deeply thoughtful literary great, to rank among the likes of Dickens, Cervantes and Dostoyevsky in my mind. Love in the Time of Cholera changed me, just like Crime and Punishment did. It affected me more than any other book has. At the time of my life when I read it, I felt that it spoke to my personal sensibilities. I followed that up with Memories of My Melancholy Whores, which I honestly think is his absolute masterpiece.

Gracias, Señor García Márquez.

ch4s3 3 days ago 1 reply      
And now we may never know why Mario Vargas Llosa punched him in the face.
dvidsilva 3 days ago 1 reply      
I'm so 'proud' to see this here; hard to think of something to say, so I'll put up one of my favorite quotes from him:

"She discovered with great delight that one does not love one's children just because they are one's children but because of the friendship formed while raising them."

anuraj 3 days ago 1 reply      
I read Marquez's masterpiece '100 Years of Solitude' in my native language, Malayalam, 25 years ago, and it got etched into my mind forever. In the next 2-3 years I read almost all the works of Marquez available in English. Later a lot more Latin American authors, including Borges, Juan Rulfo, Carpentier, Manuel Puig, Fuentes, Cortazar, Paz, etc., became popular among the reading public in my region, but GGM was the one who started it all, and he remains one of my favourite writers of the 20th century. I have a feeling that Marquez did his best work in the short story genre, despite his reputation as a novelist.

It is a coincidence that one of the first magical realist novels in my mother tongue, 'Khazakinte Ithihasam' (Legends of Khazak), was published at almost the same time as '100 Years of Solitude' was being published in Spanish, and both works present highly resplendent, almost spiritual language and journeys (almost untranslatable).

Opening line of 'Legends of Khazak'- 'When the bus finally reached Kooman Kavu, the place did not seem unfamiliar to Ravi'.

interpares 3 days ago 0 replies      
Here is a great interview with him in The Paris Review [1]. I love when he says,

"It always amuses me that the biggest praise for my work comes for the imagination, while the truth is that there's not a single line in all my work that does not have a basis in reality. The problem is that Caribbean reality resembles the wildest imagination."

I know the Caribbean very well and could not agree more.

[1] http://www.theparisreview.org/interviews/3196/the-art-of-fic...

mcguire 2 days ago 0 replies      
"In his novels and stories, storms rage for years, flowers drift from the skies, tyrants survive for centuries, priests levitate and corpses fail to decompose. And, more plausibly, lovers rekindle their passion after a half-century apart."

"Less plausibly"! "Less plausibly"!

Geeze. Someone needs to teach these goobs to write.

Myrmornis 3 days ago 0 replies      
The Spanish department at Princeton was kind enough to let me take a Spanish class while I was a visiting post-doc. It was great for learning Spanish. But it was so, so painful to see the sort of pretentious bullshit that the undergraduates were inspired to produce by reading pieces by Marquez and other South American authors writing in the "magical realist" style. I remember enjoying "Love in the Time of Cholera", and I am sure Marquez himself was great, but in general that sort of crap is exactly what you don't want your children wasting their time studying at university, potentially misdirecting their professional lives thereafter through an underappreciation of the fact that there is aesthetic beauty in real stuff and facts about how the world really works.
deckardt 3 days ago 1 reply      
This is one of the reasons I keep reading Hacker News. It's a great source for cutting-edge tech news; more importantly, it's also a great source for important news.
jortiz81 3 days ago 0 replies      
Many people in the US, when asked about Colombia, think of negative things: drugs, etc. Also, down in Colombia, the Caribbean coast was often looked down upon by those from the capital, seen mainly as uneducated people with strange customs and a different way of talking. Gabriel Garcia Marquez uplifted not only the image of Colombia worldwide, but also the image and culture of the Caribbean coast. I am proud that my family is from that region, and he made me proud to tell the world that I am Colombian. May he rest in peace.

Also, I think this quote (from an article in NPR) sums up why he's so admired in Latin America:

"Garcia Marquez is speaking about all the people who are marginal to history, who have not had a voice. He gives a voice to all those who died. He gives a voice to all those who are not born yet. He gives a voice to Latin America."

Narretz 3 days ago 0 replies      
Coincidentally, a few weeks ago I put Marquez on my to-read list. Being a native German, my first choice was naturally the German translation. English wouldn't be a problem, though, so I wonder which one I should choose. Granted, in the end the books shouldn't suffer a lot from translation.
maceo 3 days ago 1 reply      
In his autobiography he tells a story I love.

While writing 100 Years of Solitude he listened to The Beatles' A Hard Day's Night album on repeat. After the book was published he received a letter from a group of Mexican college students who asked him if he had been listening to A Hard Day's Night while writing the book, because they felt the album in his words.

loladesoto 1 day ago 0 replies      
Do not allow me to forget you, Gabriel García Márquez

(we won't forget you)

camus2 3 days ago 0 replies      
As a programmer and a poetry/book lover, this is sad news. Please take a 5-minute break from whatever code you are writing (on your free time, of course) to check this author out!
rafaelvega 3 days ago 0 replies      
I once met an American guy who told me, in fluent Spanish, that he went and studied the language after reading one of GGM's books, because he wanted to read it in its original language.
iraikov 3 days ago 0 replies      
His writing was like poetry and song, all in one. Second-best opening after "One Hundred Years of Solitude":

"It was inevitable: the scent of bitter almonds always reminded him of the fate of unrequited love. Dr. Juvenal Urbino noticed it as soon as he entered the still darkened house where he had hurried on an urgent call to attend a case that for him had lost all urgency many years before. The Antillean refugee Jeremiah de Saint-Amour, disabled war veteran, photographer of children, and his most sympathetic opponent in chess, had escaped the torments of memory with the aromatic fumes of gold cyanide."

jseip 3 days ago 0 replies      
100 Years of Solitude will stand as one of the world's greatest literary works for the next ~1000 years. RIP GGM
psantacr 2 days ago 0 replies      
Gracias por todo Gabo! (Thank you for everything, Gabo!)
Employee Equity samaltman.com
457 points by dko  2 days ago   328 comments top 57
jstrate 2 days ago 12 replies      
I've worked at two startups, including one YC company. Both were acquired by larger tech companies. I was employee #3 at one and rebuilt most of a broken codebase at the other. I got nothing out of either WRT options. I agree with the author on point 4, but I don't think more options are the answer; I should have just asked for a higher salary and I would have been better off. Startup-bucks are even worse than a lottery ticket, because of tax complications and the money required to cover the strike price.

Now I work at a large tech company in SV and won't be involved in another startup unless I'm a founder.

birken 2 days ago 6 replies      
The problem with the 10%/20%/30%/40% thing is that if the company shoots way up in value, an employee could theoretically be fired after two years and not capture much of the value they helped to create. It also doesn't necessarily need to be malicious [1], sometimes companies change and a person's skills aren't as valuable anymore.

If I were a prospective employee I would never take a deal like this, because it is really difficult to have that much trust in a company and founders that you likely don't know that much about. I can't say the standard 4-year vest with a 1-year cliff is the most optimal situation, but from an employee perspective it is way better than 10/20/30/40.

1: Though it could be; I know there was a story about something like this happening at Zynga.

skrebbel 2 days ago 6 replies      
Completely off topic, but this post made me realise that Sam Altman went from programmer to entrepreneur to financial guy. This post has very little to do with what he once started out doing. He's a partner (and president) of an investment fund now, a pretty odd career move once you take the pink Silicon Valley glasses off.

This entire post is about finance. Not about business, not about products, not about customers, just finance. Personally, I understand just about half of the entire post.

To be clear, I don't think this is a bad thing. I envy Altman for understanding this (and for running YC at an age younger than mine, but that's another thing). But that's not my point. What I wonder about is whether this is inevitable for successful entrepreneurs.

Is the path programmer -> entrepreneur -> finance the obvious one? Sam's path might've been odd, given that his startup wasn't the next Facebook, but you see the same in startups that are the next Facebook, such as Facebook. Zuckerberg used to be a PHP hacker and now he's this NASDAQ CEO. I'm not sure about Drew Houston, but all I've read about Dropbox recently is acquisitions.

Does growing a business make you a finance guy, or do you need to be somewhat of a finance guy to grow a business? I'm really curious which is the chicken and which is the egg here.

x0x0 2 days ago 3 replies      

   The best solution I have heard is from Adam D'Angelo at Quora. The idea is to grant options that are exercisable for 10 years from the grant date, which should cover nearly all cases.
That is an awesome idea, and really classy on Adam's part.

andrewfong 2 days ago 6 replies      
I've been thinking of putting together something simple to analyze employee option paperwork and add some plain-English annotations to help employees understand exactly what they're signing. Based on my experience, there are something like 5 or so templates that cover 90% of the startups in the valley, so it shouldn't be too hard. Is there any interest in something like this?
DanielRibeiro 2 days ago 0 replies      
Great post by Sam. For employees, I'd also refer to Alex MacCaw's "An Engineer's Guide to Stock Options"[1]. Alex used to work at Stripe, and at the end of his article he shares some interesting bits about stock tax alternatives not covered by Sam:

"If you can't afford to exercise your right to buy your vested shares (or don't want to take the risk) then there's no need to despair -- there are still alternatives. There are a few funds and a number of angel investors who will front you all the cash to purchase the shares and cover all of your tax liabilities."

And he goes further:

"If you're interested in learning more about financing your stock options then send me an email[2] and I'll make some introductions. I've set up an informal mailing list, and have a group of angel investors subscribed who do these kinds of deals all the time."

[1] http://blog.alexmaccaw.com/an-engineers-guide-to-stock-optio...

[2] the link is to alex at alexmaccaw.com

lpolovets 2 days ago 2 replies      
This is a great post, and I agree with almost everything Sam wrote. I think problems #1 and #4 are unfair (you might get less than you deserve, or less than you thought you were getting), but problems #2 and #3 are extremely unfair (you can't take what you've earned with you if you leave the company, or you have to pay taxes on something that has no liquid value and might not have any value in the long run).

I'd love to get Sam's (or anyone else's) thoughts on the 10%/20%/30%/40% 4-year vesting schedule that was mentioned. I don't like this schedule for two reasons:

1) It creates larger discrepancies in what employees earn over time relative to each other. If employee #1 joins today and gets a 2% grant, and employee #20 joins in 2 years and gets a 0.2% grant, then in year 3 of the company, employee #1 will vest 30x as much as employee #20, instead of 10x with the current 25%/25%/25%/25% scheme.

2) This scheme seems to replace and/or ruin refresher grants. Currently, if you do a good job, you get refresher grants every year or two. With the 10/20/30/40 system, you're already getting higher and higher compensation over time, regardless of performance, and the bump from refresher grants while you are vesting your original grant becomes minor. Furthermore, the drop from what you vest in year 4 to what you'd vest from just refresher grants in year 5 becomes much more dramatic and much more likely to push someone to look for other work.

What do others think?
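The year-3 gap in point 1 can be checked with a quick sketch. The grant sizes are the ones from the comment; `vested_in_year` is just a helper name for this illustration:

```python
# Year-3 vesting under the two schedules, using the comment's numbers:
# employee #1 gets a 2% grant at founding, employee #20 gets 0.2% two years in.

def vested_in_year(grant_pct, schedule, year_of_employment):
    """Percentage points of the company vested during one year of service."""
    return grant_pct * schedule[year_of_employment - 1]

flat = [0.25, 0.25, 0.25, 0.25]           # standard 25/25/25/25
back_weighted = [0.10, 0.20, 0.30, 0.40]  # the 10/20/30/40 proposal

# Company year 3 is employee #1's 3rd year and employee #20's 1st year.
for name, sched in [("25/25/25/25", flat), ("10/20/30/40", back_weighted)]:
    e1 = vested_in_year(2.0, sched, 3)
    e20 = vested_in_year(0.2, sched, 1)
    print(f"{name}: #1 vests {e1:.2f}%, #20 vests {e20:.2f}% ({e1 / e20:.0f}x gap)")
```

Under the flat schedule the year-3 gap is 10x (0.50% vs 0.05%); back-weighting widens it to 30x (0.60% vs 0.02%).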

mikepurvis 2 days ago 0 replies      
I'm curious about this bit:

"It causes considerable problems for companies when employees sell their stock or options, or pledge them against a loan, or design any other transaction where they agree to potentially let someone else have their shares or proceeds from their shares in the future in exchange for money today."

What are the problems with these schemes? I'm presently employee #1 at a startup, and 99.9% of my present net worth is tied up in illiquid paper there; the rest is a 10-year-old station wagon and some Ikea furniture.

I'd really like to be able to pledge my options for a loan to buy a house, so I'm curious to know the issues which may arise from such an arrangement.

mikeleeorg 2 days ago 1 reply      
Very interesting. I like this train of thought. I have a lot of developer friends who would rather pursue (and are pursuing) their own entrepreneurial ideas than join an existing company. While I wholeheartedly support that, the flip side is fewer startup-savvy developers available to join other startups.

There are a lot of reasons why they are pursuing their own ventures. A common one is: "It's not worth it to be an employee of a startup. You need to be a founder. (Or maybe employee #1-5.)" You may disagree with that belief, but it's certainly a belief many hold. Sam's suggestions may take this reason off the table.

diziet 2 days ago 5 replies      
It's quite difficult to compete with Google and their revenue/cash hoards when it comes to salary / total comp, especially if you price the options at the last round's price and discount them some more.

Imagine a well-to-do company of 2 founders (in SF/Bay Area) and a team of 3-4 others that raised a seed at a 10m cap. They want to grow their team headcount to 15 and are busy hiring, running servers, etc. They can offer a 100k salary (more than enough to live on) to a sort-of-senior engineer or PM and want to compete with Google on total comp. Let's say they need to make up the other 100k difference in comp with options. Over 4 years, you're looking at a 4% equity chunk for one employee, the 6th person joining the company.

Not that I think numbers in line with this aren't realistic (I do agree with Sam that more generous equity grants are better), but for most companies that set aside a 15% option pool for employees, it's difficult to rationalize a number like that.

Edit: Also, that puts the equity comp of that 6th employee (or 10th, because in most cases you will have a similar equity bracket for those people) at about 1/8th of the founders, not the 1/200th that Sam mentioned. I wonder how many people have made offers to employees with a similar comp plan.
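A sketch of the back-of-the-envelope math above, using the numbers from the comment. The $200k big-company total comp is an assumed figure for the Google offer being matched, not something stated in Sam's post:

```python
# Matching big-company comp with options at a $10M valuation.

big_co_total_comp = 200_000      # assumed total comp at Google (illustrative)
startup_salary = 100_000         # what the startup pays in cash
years = 4                        # vesting period
company_valuation = 10_000_000   # the seed round's cap

options_value_needed = (big_co_total_comp - startup_salary) * years
equity_pct = options_value_needed / company_valuation * 100

print(f"${options_value_needed:,} of options -> {equity_pct:.0f}% of the company")
# A single hire at 4% eats over a quarter of a typical 15% option pool.
```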

sparkzilla 2 days ago 0 replies      
As a founder I looked into paying vendors/employees with options, but have found they are too brittle. Because option deals are created at the start of employment, they require a lot of faith on the part of the founder, who does not know the employee's abilities or temperament. Options do not track well with performance and cannot be adjusted easily. I also do not want to be in the position of considering terminating an employee because they have more options than I think they are worth, and employees should not have that fear either.

Instead I am working on giving vendors and employees a convertible note that is based on their performance month-by-month. Let's say an employee or vendor is taking $5000/month less than they should be because it's a startup. The company credits $5000 to their note each month (this can be more if there's a risk premium), and adds any performance bonuses as they come up. This lets management clearly track performance against the shares they are giving, and lets the employee know that if they work more they can get more. As time goes on the value of the note increases, and the employee can convert their note to shares at the current valuation (or a discounted valuation).

This seems a lot more flexible to me than options, and is less stressful for the founder and the employee. Am I missing something?

philovivero 2 days ago 2 replies      
I worked as one of the very early founders of Digg. I bought my options. Obviously they're worth nothing, yet I owe the IRS about $120k. This threatens to destroy all my savings, retirement, and credit for 10 years.

ISOs are not only worthless 95% of the time, they're also actively EXTREMELY DANGEROUS 50% of the time if they're not simply worthless.

My suggestion: get a salary, and buy just-IPO'd stocks from companies you believe in.

If you find yourself ready to buy some ISOs, I further recommend you IMMEDIATELY sell them, as in have the buyer sitting there with you as you purchase the ISOs, and do the trade instantly thereafter. Take the short term capital gains hit. Do not hold onto them no matter what any CPA or tax attorney tells you unless they can talk at length about ISO+AMT Tax Trap and assure you you cannot possibly have that happen to you.
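The AMT trap being described works roughly like this. The share count, strike, fair market value, and 28% AMT rate below are invented round numbers for illustration, not Digg's actual figures, and none of this is tax advice:

```python
# Hypothetical ISO+AMT trap: exercising creates a taxable "paper gain"
# (the spread between strike and fair market value) even though no cash
# was received and the shares may later become worthless.

shares = 50_000
strike = 1.00            # per-share exercise price
fmv_at_exercise = 10.00  # fair market value when exercised
amt_rate = 0.28          # illustrative federal AMT rate

spread = (fmv_at_exercise - strike) * shares
amt_owed = spread * amt_rate

print(f"Cash paid to exercise: ${strike * shares:,.0f}")
print(f"Paper gain (AMT preference item): ${spread:,.0f}")
print(f"AMT bill: ${amt_owed:,.0f}")  # owed even if the stock later goes to zero
```

With these made-up numbers the exercise costs $50k in cash and leaves a ~$126k tax bill on gains that may never materialize, which is the shape of the situation the parent describes.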

jboggan 2 days ago 0 replies      
I'm going to be in a position soon to start hiring people and I've been thinking long and hard about this. I do think that engineers tend to get the short end of the stick when it comes to options, even when the nominal percentages sound good. I can think of friends who were early engineers at "successful" companies that took an awful long time to see any real money, let alone the vast majority who get nothing.

I'm seriously considering a profit sharing / options system where options are only vested in quarters that are unprofitable and profit sharing occurs otherwise. I know that this wouldn't be different at all for many start-ups that have little chance of profitability early on, but for those that do it could be a very interesting way to align interest and not screw the employees.

gibybo 2 days ago 0 replies      
Vesting options at a startup are really like second-order options. If they were granted to you immediately they would just be ordinary options: you have the option to buy the stock at the strike price. However, since they must vest over a period of time in which you are sacrificing a higher salary, you are also given the option of whether to continue vesting those options (by staying at the company) or not (leaving the company).

The second-order option is what makes them valuable. Most startups either grow aggressively during those 4 years or they die. If they fail early, you don't have to sacrifice much salary for the now worthless options. If they are doing well, the options are now worth much more yet you are still only sacrificing the same amount of salary for them.

The problem is that the value of this presents a direct conflict between the company and the employee. When the value of the unvested options grows, the company can reduce the unvested amount (or fire the employee if they don't agree)[1], because it will be disproportionate to the value the employee is providing. Note that they don't actually have to go after the unvested shares to recapture this value. They can go after any other form of compensation they are providing, since it will still be more than the employee can get elsewhere. Essentially, this means the employee's upside potential is severely limited. Since the value of a share in a startup is based almost entirely on a massively higher future value, this tremendously reduces the value of typical startup vesting options.

If I worked for a startup I'd want straight equity. Find the value of the common stock and pay 10-30% of my salary in common stock. The amount of shares will float as the value of the company does, but this is required in order to keep incentives aligned. I'll pay the tax out of my salary (at ordinary income rates). If the company succeeds, almost the entire value derived from the equity will still be taxed at capital gains rates.

[1] See Zynga, Skype, and probably many others we never hear about.

brudgers 2 days ago 1 reply      
Altman's post suggests that the context needs changing. I suspect it needs changing to keep up with some of the very changes YC has wrought - changes to VC and the creation of startups and the options available to the sorts of employees startup founders need.

The issue is that the new startup culture has diversified power, and our concept of 'business founder' is out of date. A software company founder is not the analog of a white-shoe law firm partner. A personal relationship with Jeff Bezos isn't why people buy toasters from Amazon or host their SaaS on AWS; it's not some Rolodex full of 30 years of golf-course relationships and keeping the jobs of bureaucrats secure that makes it rain. "On the internet, nobody knows you're a dog." [1] Or cares that you're a founder.

While I agree with Altman that something needs to change in the direction of making employees richer, I think he probably doesn't go far enough. The problem isn't so much the tax code as capital structure, and the rigidity of company structure that results.

A key hire is a key hire because it changes the company. Ideally, a company would change its structure to reflect that change. Ideally, a company's capital and corporate structures would be as agile as its development process.

Key employees are just as exposed to the 'you can be a founder' meme as everyone else, and they're in a better position to pursue it than most. A founder shouldn't expect talent to hang around making them rich. In terms of game theory, I think of it as a founder's dilemma. Altman's piece suggests YC might be seeing it too.

In the current context, a founder's 30% of a $40,000,000 exit is better than even a 1% employee share of a $1,000,000,000 one -- much better, perhaps, than the numbers would suggest, because 30% gets a seat at the table. And that old Mark Cuban idea of looking around the table? Well, if you're not at the table, the worst case is you're just dead money picking up the tab for someone's boat payment.

[1] http://www.paulgraham.com/hiring.html
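The exit comparison above does check out arithmetically, even before liquidation preferences and dilution (which would generally widen the gap further):

```python
# Founder's 30% of a $40M exit vs an employee's 1% of a $1B exit,
# ignoring preferences, dilution, and taxes.
founder_take = 0.30 * 40_000_000
employee_take = 0.01 * 1_000_000_000

print(f"founder: ${founder_take:,.0f}, employee: ${employee_take:,.0f}")
```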

ChuckMcM 2 days ago 1 reply      
I am a fan of giving options every year with a performance multiplier. That way the high performers are rewarded with more options and your available options are more accurately divided amongst the employees who have made the most impact.

When you are not yet cash flow positive as a startup you can give 'bonuses' in options rather than in cash.

I don't know if we could figure out a way for employees to sell a portion into additional investment rounds if they wanted to take some money off the table.

pyrrhotech 2 days ago 0 replies      
The real villains here are the VCs, and to some extent YC for promoting them. VCs are the ones who perpetuate the myth of "work 80 hours a week for a startup at 50% of market rate and you'll be rich in 4 years". In reality, they take all the preferred stock, so that even if the company sells for double or more what it was worth when you joined, you end up with nothing. I've worked at a startup that sold for 4x what it was worth when I joined, and I still ended up with nothing. A couple of guys who had been there longer ended up with a few thousand dollars. What a scam!

Work at a large, established organization and earn your fair market rate at a 40-hour work week, and start your own company on the side if you want to get rich, folks. I'd never work for any startup again unless I was the founder.

rdl 2 days ago 1 reply      
I don't think the 4/1 aspect of vesting is a particularly big problem. If you are enjoying your job at 4 years, the job has probably changed substantially, and you can renegotiate for a refresher grant.

I don't see any problem with restricted stock pre series A, when equity is the biggest consideration for employees. As long as financing is notes, the common hasn't yet been priced, so you can just use a very low value.

Willingness to issue refresher grants is easy for CEO and board to change.

I don't think you need to be as open as Buffer, but being open with percentage ownership and financials seems obvious.

RSUs with a performance modifier already cover most of this for larger companies. Something like that for startups probably wouldn't work since so much of the risk is company-wide vs. individual.

danbmil99 2 days ago 2 replies      
Has anyone had experience with "early exercise" of (non-ISO) options? As I understand it, this strategy lets you treat them for tax purposes as if you bought the underlying stock, meaning no tax liability at vesting or exercise, and capital gains are all you pay at final sale.

The downside is you have to pony up the full strike price of all the shares at hiring. Works great if the company valuation is still nominal (i.e. before a 'valuation event' such as a Series A, though there may be capped-note seed investment already).

One could imagine a company offering a hiring bonus that covers the cost of early exercise (padded for expected tax loss).

Maybe the real problem is this shit is complicated. Then again, we're programmers, right? Don't we do complicated by nature?

leccine 2 days ago 0 replies      
I calculated my hourly rate including the money I would get after the IPO at a $40 share price, and it came out around $100. This is extremely sad given that I am a senior engineer; imagine what somebody in a lower-paid position gets. Generally speaking, I think it is not worth it to work 12 hours a day for a startup and get 10K shares over 5 years. If you actually work 8 hours and spend your spare time on a side project, you might end up way better off. You could get ideas from The 4-Hour Workweek.
awicklander 2 days ago 2 replies      
There's another option that people never seem to talk about. Treat people well, give them a good working environment, and give them a fair salary based on the fact that they don't have any equity.

Most engineers I know with stock options and a discounted salary would have been much better off with a higher annual salary and no stock options at all.

zck 2 days ago 0 replies      
There's another effect of the ten-year exercise window.

Remember how Facebook was "forced" to go public because so many people owned stock? (http://www.businessinsider.com/why-the-sec-will-force-facebo...). Well, if there's a ten-year exercise window, some of the people will hold their options and not exercise them. My -- albeit limited -- understanding of the situation is that those people are not counted as stockholders. They have options, not stock.

So the ten-year exercise window is also good for the startup, because it delays the time until the startup has to publicly disclose its financials.

aetherson 2 days ago 2 replies      
I don't understand why options are taxed at exercise. You aren't getting money out of the transaction. If you have an option to buy a share at $1 (when the share is valued at $10), and later you sell at $50, why isn't the tax treatment just that you have a $49 capital gain? Why do we instead do a $1 -> $10, and then a $10 -> $50 tax thing?
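For a concrete sense of the difference, here is the comment's $1 / $10 / $50 example worked both ways. The rates are placeholders (roughly the NSO-style treatment, where the spread at exercise is ordinary income), and this is an illustration, not tax guidance:

```python
# Per-share tax under the two treatments in the comment's example.
strike, fmv_at_exercise, sale_price = 1.0, 10.0, 50.0
ordinary_rate, capgains_rate = 0.37, 0.20  # illustrative placeholder rates

# Current two-step treatment: the $9 spread is taxed as ordinary income at
# exercise, the remaining $40 as capital gain at sale.
two_step = ((fmv_at_exercise - strike) * ordinary_rate
            + (sale_price - fmv_at_exercise) * capgains_rate)

# The treatment the comment asks about: one $49 capital gain at sale.
single_gain = (sale_price - strike) * capgains_rate

print(f"two-step: ${two_step:.2f}/share vs single gain: ${single_gain:.2f}/share")
```

Beyond the rate difference, the two-step treatment also means tax is due at exercise, before any cash has actually been received.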
johnrob 2 days ago 1 reply      
The easiest would be if the IRS would agree to not tax illiquid private stock until it gets sold, and then tax the gain from the basis as long-term capital gains and the original value as ordinary income.

I think employees would be more than happy to treat all of this as ordinary income, if that would make it more appealing to the IRS.

porterhaney 2 days ago 3 replies      
Adding to Sam's post, I'd like to see employees made aware of tools like 83(b) elections to decrease their tax liability.
runT1ME 2 days ago 1 reply      
>Perhaps the best way to think about it is to try to come up with a total compensation package with the same expected value (using the company valuation of the last round, or a best-efforts guess if it's been a long time since the round) as the employee would get at a big company like Google

Am I missing something or is this saying people should be offered an 'expected' equal compensation package to what they would get at Google? What would the incentive be? Google is a company with quite a bit of projected longevity, career progression, and very good perks. Why would I choose a startup with inherently greater risk for only the same reward?

semerda 2 days ago 0 replies      
Mary Russell & Chris Zaharias are trying to do that here: http://stockoptioncounsel.com/ -- educating folks on stock options and their rights, backed by a "bill of rights" endorsement. There are all sorts of clauses and tax implications around options that confuse people. Most end up believing the % they got will make them a millionaire.

This is a great opportunity for Freakonomics to dig into the state of stock options in startups.

When I was in my 20s I was more gullible and taken in by all the talk of stock options and becoming a millionaire from them. However, I never stopped investing in property, and after 10 years I am happy I continued investing in tangible assets that I was in control of. Stock options are a lottery at best. And as you get older, and learn the value of money and your time, you see the opportunity costs more clearly.

As a side note, I've been through an IPO and was fed all the brainwashing leading up to it. Reality is always far from the dream. Many people don't like to talk about their failures, only their successes, hence you hardly ever hear about this.

Now, saying all that, there is the minority that strikes it rich, either by being an early employee of a startup that goes big (a small % of something large) or by being a founder of a successful startup when the stars align.

Employee compensation in startups will need to change as more folks start to realize the opportunity costs.

My word of advice: invest in yourself and in stuff you are in control of.

jbkp 2 days ago 4 replies      
You know, it's funny, I read things like this from time to time: "so if I have 0.5% of the company and it gets acquired tomorrow for $100 million dollars, will I get $500,000?" and I remember that I am in this exact scenario, and have no idea what the answer is. I've been an employee at a startup for 2 years now. I joined when I was young, naive, and broke; I don't even remember if I read the paperwork before signing it.

Does anyone have any advice for how to go about learning more about employee options? I realize I sound dumb, but better late than never.

Some questions I've always had but have been too afraid to ask:

- How does one exercise their options?

- What taxes are there and when do you have to pay those?

- In the above scenario, what factors are involved in me actually getting that $500k?

- What questions am I not thinking of because I don't know enough about any of this? For example, I've never asked about my options since signing the paperwork: was there something I should have already done that I haven't, and that will likely screw me in the future?

P.S. Throwaway for anonymity (because I am embarrassed to have to ask!).
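On the first question in the quoted scenario ("0.5% of a $100M acquisition = $500k?"), the honest answer is "it depends on the preference stack". A minimal sketch with invented numbers -- $50M raised with a 1x non-participating preference, investors at 30% -- shows the mechanism; real cap tables add option pools, participation terms, and taxes on top:

```python
# Simplified exit waterfall: 1x non-participating preferred vs common.
exit_value = 100_000_000
total_preference = 50_000_000  # assumed: $50M raised with a 1x preference
investor_pct = 0.30            # investors' fully diluted ownership (assumed)
employee_pct = 0.005           # the 0.5% from the question

# Investors take the greater of their preference or their as-converted share.
investor_take = max(total_preference, investor_pct * exit_value)
common_pool = exit_value - investor_take

# Common holders (the remaining 70% of shares) split the rest pro rata.
employee_take = common_pool * (employee_pct / (1 - investor_pct))

print(f"Naive answer: ${employee_pct * exit_value:,.0f}")
print(f"With the preference: ${employee_take:,.0f}")
```

With these made-up terms the employee sees roughly $357k rather than the naive $500k; with a smaller preference stack the two answers converge.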

mschaecher 2 days ago 0 replies      
Back-weighting seems backwards to me, especially for early employees. They receive fewer options for the risky, earlier stage and more options once things are stable and proven. Most startups won't even make it 4 years, and therefore early employees who take that risk can end up with almost nothing if a sale or IPO occurs in, say, 18 months after they start.
emocakes 2 days ago 0 replies      
I worked at a startup, was employee number 4 and the 2nd lead developer after the CTO, and got offered a pathetic 0.025% over 4 years. Options like that are disheartening and really don't make you want to stick around for 4 years getting paid dirt to eventually be able to claim your $20k worth of options.

I left and now am getting paid close to triple my old salary with options getting close to 10% in a business model that is far more profitable than the previous. I think lots of people just starting out in the startup scene get taken advantage of and taken for a ride.

filmgirlcw 2 days ago 0 replies      
This is a fantastic article.

Sam is dead-on that the current situation isn't fair and often offers employees little to no information about how the options work.

The 90 days to exercise thing is a real bummer -- for lots of reasons. As Sam says, not every employee is in a position to relinquish that kind of money for the options and taxes. I would say that if you are looking to go someplace else, depending on the size of the company and the situation, it's not out of line to try to factor the value of the options you won't get to exercise (or even the exercise cost) into your new salary. Most companies aren't going to be willing to give you what you'd need to vest out upfront, but it is a good way to negotiate either a one-time bonus or a higher salary.

HowardMei 2 days ago 0 replies      
As far as I know, Huawei was the only real employee-co-owned company on the planet issuing dividend-attached 'virtual' stock to its employees, where 'virtual' means the stock's validity is tied to employment.

Engineers working at Huawei bought shares priced at net asset value, paid for with salary or bank loans, and earned dividends at a yearly ROI of around 17% to 75%.

This unique 'communist' capital structure was created due to a lack of venture capital and outside financing. It's also an experiment from before China fully adopted Western-style corporation law.

Huawei has a complicated capital structure of founder (1.42%) + employee union (98.58%), which scared many big investors away and kept it from an IPO.

Recently, Huawei adjusted the virtual stock policy to freeze its capital structure, because the structure's complexity drew a lot of accusations from the US government and harmed its growth in several major markets.

Alibaba also failed in its request to change Hong Kong IPO rules to allow an employee-partnership structure to protect its senior employees.

Therefore, employee equity isn't merely about internal profit sharing or fairness. Investors and traditional capital markets don't like 'communist'-flavored capital structures.

Employee options are the only viable solution until someone totally disrupts the current capital market.

applecore 2 days ago 3 replies      
> Founders certainly deserve a huge premium for starting the earliest, but probably not 100 or 200x what employee number 5 gets.

When the founders started the company, their equity was pretty much worthless. When employee #5 is hired and gets 0.50% of the company, her equity presumably has some dollar value. Employee #5 gets a better deal than the founders, even though the founders have 100x more equity.

The only thing that matters is the dollar value of the equity at the time it's awarded.

jpasmore 2 days ago 0 replies      
Tax laws make this more complex than it needs to be. It would be ideal to eliminate options altogether and compensate employees with stock.

Take the market value of a job minus the amount the employee is actually paid (the startup discount) and pay the discount in stock -- common shares (VCs will be in preferred). All employees should get 2% of salary in shares as a starting point. Allow employees to buy additional shares by forgoing comp or simply investing. Peg share price and the timing of share grants to rounds or any investment (notes).

Perhaps have repurchase rights only if terminated for cause. It doesn't matter if someone comes in for just 8 months, as long as they add value during that period -- so the vesting concept is eliminated.

This would need the IRS to change the treatment of grants from ordinary income to capital-gains-type treatment, where taxes are paid only when some actual liquidity event or transaction occurs.

aferreira 2 days ago 4 replies      
Regarding the question of knowing what percentage of total equity your stock grant represents, most companies that are not incredibly early stage will simply not tell you.

Pushing the subject further will make you look like you're nosing around where you shouldn't, often leading to the offer being dropped (this has happened to me).

Granted, it was a not-so-great company to start with, but a dropped offer is a dropped offer.

STRML 2 days ago 1 reply      
I'm starting to see companies tossing around the idea of "Phantom Stock Options"; that is, shares kept purely on paper that are never issued to the employee. Upon a liquidity event, the employee can exercise the shares and be paid their value as regular income.

This has some tradeoffs, some of them positive, some of them negative, but I am far from an expert and would love some input from somebody who knows more.

It does appear to be vastly simpler for all parties, and it completely eliminates any possibility of a tax trap. However, it seems to guarantee that you will be paying income tax on the sale, which can be quite sizable. And the specifics of what happens after you leave, voluntarily or otherwise, are incredibly important, considering that you are never granted any actual stock.
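Some rough arithmetic on that income-vs-capital-gains point, as a sketch. Both tax rates below are hypothetical placeholder figures for illustration, not tax advice:

```python
# Hypothetical comparison: a phantom-stock payout taxed as ordinary income
# vs. real shares held long enough for long-term capital gains treatment.
# Both rates are assumed figures for illustration only.

ORDINARY_RATE = 0.396  # assumed marginal ordinary-income rate
LTCG_RATE = 0.20       # assumed long-term capital gains rate

def after_tax(gain, rate):
    """Net proceeds on a gain taxed at a single flat rate."""
    return gain * (1 - rate)

gain = 500_000  # pre-tax gain at a liquidity event

print(round(after_tax(gain, ORDINARY_RATE)))  # phantom stock: 302000
print(round(after_tax(gain, LTCG_RATE)))      # held shares:   400000
```

On these assumed rates, the same $500k gain nets roughly $100k more under capital gains treatment, which is the size of the gap the "income tax on the sale" concern is about.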

mrmch 2 days ago 0 replies      
Would it be within the YC wheelhouse to provide standard employee equity agreements (just like the YC note)?
fragsworth 2 days ago 0 replies      
> startups try to have very small option pools after their A rounds, because the dilution only comes from the founders and not the investors in most A-round term sheets.

Why is this the case? If the goal is to align the interests of the investors with the interests of the founders, this arrangement seems to put them at odds instead.

A company's total value might be quite a bit higher by having the ability to offer large amounts of employee options (just as an example, the ability to easily hire media personalities with big followings without breaking the bank), which is good for both the founders and the investors.

I understand the investors are trying to protect themselves from the founders deciding to give a ton of shares to their friends (and then potentially back to the founders, in other ways), but I wonder if there is a better solution to this.

lectrick 2 days ago 0 replies      
Maybe startups should abandon the stock market process entirely and issue a new cryptocurrency instead. The founders can pre-mine whatever percentage they wish and then pay employees in part in that currency, which would be traded on an exchange the same way stocks currently are.
DavidWanjiru 2 days ago 1 reply      
The thing I try to think about, in the context of me being the owner of a successful business (and not necessarily in software), is profit sharing, as opposed to equity sharing. Profit is a degenerate case of equity, in the sense that a large (albeit not whole) part of why you want to own equity is to own a share of the profit. At any rate, at the level of employee options, you won't own enough equity to play the decision-making role that holding equity enables. Beyond that, the value a market assigns to equity you own is (should be!) ultimately dependent on the profit that will accrue to that equity. At the same time, profit sharing is a lot less messy and much more rewarding to employees than equity. Sure, you're not getting a share of this asset that you've helped build, but from what I'm hearing, the story is the same with options. And profit should be easier to "give away" than equity from the founders' perspective, I think. I realize that sharing profit is complicated when businesses are in the red, but on the whole, I suspect there might be better value in the idea for all involved. Not that I have any idea how exactly to go about sharing this profit, assuming it exists. I just happen to think it might be a more satisfactory path to take, assuming the fork in the road reads "Equity Sharing" this way, "Profit Sharing" that way.
zosegal 2 days ago 0 replies      
I think the Wealthfront Equity Plan is pretty interesting: http://firstround.com/article/The-Right-Way-to-Grant-Equity-...
ivan_ah 1 day ago 0 replies      
I wish there existed exit strategies other than IPO and being bought.

At the rate at which tech-giants are buying tech-startups, we'll end up with very few, very large tech conglomerates. I'm not sure how efficiently things run in these giant companies. More managers = more trouble and less autonomy for the lower levels of the pyramid. Central management and the pyramid are almost like communism, and we know how well that turned out...

Why can't a mid-sized profitable company pay out dividends to stockholders? Say, growth mode for the first 5 years to reach profitability, then start cutting cheques to founders, early employees, and first-round investors. I know losing cash will probably hurt the company momentarily and stunt the growth of the business, but then you start a second round with a new pool of employee stock, and new investors come in to do another 5 years.

Basically, whoever wants to can smooth-exit after 5, 10, or 15 years, while keeping the same company envelope, mission, and mid-sized company culture throughout the company's life. I guess this would work only for //very// profitable companies that end up with lots of cash in the bank, but if you haven't built a profitable company after 10 years, what's the point?

sskates 2 days ago 1 reply      
I'll be forwarding this to our lawyer when we implement the legal paperwork on our stock option plan. We already do 1) and 4) as much as we can.

If anyone here has any ideas of how else we can be more friendly to employees with regard to equity I'm all ears.

sscalia 2 days ago 0 replies      
Great article. It should read "How not to get fucked at a startup"

This coming from someone who got bent over a barrel.

7Figures2Commas 2 days ago 0 replies      
There are a lot of things in this post that deserve to be addressed, like the fact that the 90 day exercise period for ISOs after termination is based on IRS rules, not arbitrary company policy.

But what really needs to be addressed is the fact that employee startup equity rarely produces the kind of reward that one would expect it to given the outsize attention that is paid to it. Sam writes:

> As an extremely rough stab at actual numbers, I think a company ought to be giving at least 10% in total to the first 10 employees, 5% to the next 20, and 5% to the next 50. In practice, the optimal numbers may be much higher.

It's worth testing these numbers against real-world data. For this, I'll use CB Insights' 2013 Global Tech Exits Report[1], which shows that:

1. 1,825 private tech companies exited in 2013.

2. Only 19 of them exited at a $1 billion-plus valuation.

3. 45% of exits were under $50 million, and 72% of exits were under $200 million.

If you assume that the first 10 employees receive 10% of a company's equity, and that each employee in that group receives 1%, a $200 million exit produces up to $2 million before taxes for each of the early employees. A $50 million exit produces $500,000. If you're making $125,000/year as a senior engineer, $500,000 gross after 4 years is the equivalent of what you earned in salary over the past 4 years. That's a nice bonus, but not life-changing wealth. $2 million is nicer, but if you plan to stay in the Bay Area, you might spend half or more of that on a modest house or condo.
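The back-of-the-envelope numbers above are easy to reproduce; a minimal sketch, where the fixed 1% stake and the flat 30% blended tax haircut are illustrative assumptions:

```python
# Gross and (very roughly) net payout for an early employee holding a fixed
# percentage at exit. Ignores dilution, liquidation preferences, and
# exercise costs; the 30% blended tax rate is an assumed figure.

def gross_payout(exit_value, stake):
    return exit_value * stake

def net_payout(exit_value, stake, tax_rate=0.30):
    return gross_payout(exit_value, stake) * (1 - tax_rate)

stake = 0.01  # 1% of the company, per the scenario above
for exit_value in (50_000_000, 200_000_000):
    print(f"${exit_value:,} exit -> "
          f"${gross_payout(exit_value, stake):,.0f} gross, "
          f"${net_payout(exit_value, stake):,.0f} after the assumed haircut")
```

The $500,000 and $2 million gross figures match the comment; the net numbers only shrink further once real-world dilution and preferences are layered on.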

Once you factor in the cost of exercising your options, taxes, dilution, liquidation preferences, lack of acceleration and the fact that a good portion of employees leave before fully vesting, you can see that even in a scenario where 10% of the company is given to the first 10 employees, employees aren't likely to see the type of compelling returns that Silicon Valley dreams are made of. Facebook and Twitter-like exits, where thousands of employees become paper millionaires overnight and the earliest gain tens or hundreds of millions of dollars, are the exception, not the rule.

What's worth considering further is the fact that 66% of the companies that exited in 2013 had raised no institutional capital according to CB Insights. So, as a prospective employee, in joining a venture-backed company (or a company coming out of a prominent accelerator), you may be putting yourself at a disadvantage even before you take into account the fact that employee equity is most vulnerable to dilution and liquidation preferences at these companies.

Final note: CB Insights' 2012 Global Tech Exits Report[2] shows similar trends to the 2013 report. In fact, in 2012, over half of exits were under $50 million and 76% of the companies that had an exit had not raised institutional capital.

[1] https://www.cbinsights.com/blog/global-tech-exits-report-201...

[2] https://www.cbinsights.com/blog/tech-mergers-acquisitions-de...

spo81rty 2 days ago 3 replies      
This is where having a startup outside of the valley is nice. Nobody where we are (KC) really even expects stock options. We just pay a good competitive salary and don't have to compete with someone like Google paying 2x as much. We have given some people stock incentives but because we pay well and competitively it isn't the primary compensation. The costs of running a startup are so much lower here.
d2ncal 2 days ago 0 replies      
Great article. One thing he forgets to mention is letting employees "pre-exercise" their options.

For a very young startup (even for Series A), the shares are still worth pennies per share, and letting employees pre-exercise the shares not only saves them from AMT but also lets the long term capital gains kick-in sooner.

Only a few startups that I've seen do this, and it's really effective, especially for employees.
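A toy illustration of why pre-exercising can matter: the AMT-relevant "spread" at exercise is (fair market value - strike) per share. The strike price and later FMV below are made-up numbers, and real AMT treatment is considerably more involved:

```python
# Spread at exercise under (simplified) ISO/AMT rules: shares * (FMV - strike).
# Exercising while shares are still worth pennies keeps the spread near zero;
# waiting until after later rounds can make it large. Numbers are hypothetical.

shares = 13_000
strike = 0.01  # assumed strike price per share at grant

def amt_spread(fmv_per_share):
    """Taxable spread if all options are exercised at the given FMV."""
    return shares * max(fmv_per_share - strike, 0)

print(amt_spread(0.01))         # pre-exercise immediately: 0.0
print(round(amt_spread(5.00)))  # exercise at a later $5 FMV: 64870
```

Pre-exercising also starts the long-term capital gains holding period earlier, which is the second benefit the comment mentions.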

PabloOsinaga 2 days ago 0 replies      
I totally dig these ideas - are there any consensus docs floating around that we can use for our employees? And/or is anybody implementing these ideas today? (Perhaps we can borrow their docs.) Thx
dalef 2 days ago 1 reply      
Great article, but I still don't really understand some parts of the whole picture. Can someone help here?

I am now working at a Series A company, with 0.13% of the company: 13,000 shares (options). On the other side, Pinterest offered me 30,000 RSUs, which I turned down because I thought Pinterest was already a late-stage company.

But after doing this research (including reading this post), I am wondering if I made the right decision. My 13,000 shares will always be 13,000 shares, no matter how much dilution we have in the future, right? So does that mean that even if my company grew to the size of Pinterest, I would still only have those 13,000 shares, instead of the 30,000 I could have gotten from Pinterest with less risk?

Or have all the late-stage startup companies split their stock? Otherwise I don't see how joining an early startup for 13,000 shares would be any better.
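On the share-count question: a fixed 13,000 shares shrinks as a percentage because the denominator (total shares outstanding) grows each round. A sketch with made-up figures (the 10M fully diluted total and 25% new shares per round are assumptions):

```python
# Ownership percentage = your shares / total shares outstanding.
# Your share count stays fixed; dilution grows the denominator.
# The starting total and per-round dilution below are hypothetical.

my_shares = 13_000
total_shares = 10_000_000  # assumed fully diluted total at Series A

def ownership_pct(mine, total):
    return 100 * mine / total

print(ownership_pct(my_shares, total_shares))  # 0.13 (%), as in the comment

# Two later rounds, each issuing 25% more shares:
for _ in range(2):
    total_shares *= 1.25

print(round(ownership_pct(my_shares, total_shares), 4))  # 0.0832 (%)
```

So the share count never changes, but the slice of the company it represents does; the early-stage bet only wins if the company's value grows faster than the share base dilutes.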

practicalpants 2 days ago 2 replies      
This is probably not the right vehicle to ask 'Am I being treated fairly?', but I think I will anyways. The startup is pre-Series A, and I'm the first non-founding/non-executive-level engineer. I'm technically a contractor but treated pretty much exactly like an employee (I know that's a whole separate thing). I'm not the most experienced engineer, i.e. last year at my prior job I was intermediate level, but this year I would be considered senior at most organizations. I get a decent hourly rate, it's 95% remote, and my equity percentage is... .25% with four years of vesting.

I could be wrong, but I've come to the conclusion that after dilution and taxes, anything short of a billion-dollar exit isn't going to be compensatory for my efforts. I don't know how correct my conclusion is, or whether I should try negotiating for more.

mathattack 2 days ago 0 replies      
I've seen companies strategically fire people to get out of option awards. Or grant very generous options, only to plan on firing the folks later. Very shady business.

I've become a bigger believer in cash. Unless you TRULY believe the vision.

bankim 2 days ago 0 replies      
Kudos for a post focusing on startup employees and not founders!
joewallin 2 days ago 0 replies      
Congress should change the law so that the transfer of stock to workers is not taxed. I am not sure why pro-worker legislation like this wouldn't be supported.
logfromblammo 2 days ago 0 replies      
I can only speak for my own experience, but everyone I have ever known has always been screwed by options. As such, I automatically assign a value of $0 to any options attached to an employment offer. You can pretend that yours are worth more thanks to your unique structuring as much as you like, but thanks to everyone else in the industry, you will still have to convince your employee that you are not just spewing delusion at him.

While I can't prove it, I believe I was once fired just to prevent my options from vesting.

As an employee, you're really better off with zero options and a higher salary 99.9% of the time. But that means the owners have to sell more of their equity to make payroll.

If you want to be a nice guy and keep the early employees eligible for big payouts, take your share of the buyout/IPO and give them bonuses out of that. No one trusts the option plans any more.

bambam12897 2 days ago 0 replies      
I wonder what the author thinks of ESOPs and cooperatives.
derekrazo 2 days ago 0 replies      
You could run your startup as a co-op.
ironhide 2 days ago 0 replies      
You either own the company or you're nothing.
Postman: a powerful HTTP client (for Chrome) to test web services getpostman.com
387 points by caio1982  1 day ago   138 comments top 52
stock_toaster 1 day ago 2 replies      
I prefer CLI tools for API driving/testing, and httpie[1] works pretty well for that.

[1]: https://github.com/jkbr/httpie

te_chris 1 day ago 5 replies      
I use Rested; it's a simple, cheap Mac app and works great. http://www.helloresolven.com/portfolio/rested/

EDIT: Why the hell was I downvoted for suggesting a good app? goddammit...

mikegioia 1 day ago 1 reply      
I auditioned about 10 different Chrome/Firefox extensions to send HTTP requests and this was by far the best one. It's clean, simple to use, and handles auth really well.
a85 1 day ago 7 replies      
Thanks for the great comments everyone. Postman developer here. Woke up to find the link at the top of HN. Feels awesome. :)
Osiris 1 day ago 1 reply      
I've been using Postman for quite a while now and it's really handy. You can save requests in collections so you can rerun the same requests for testing purposes. It's great if you're deving a REST API and need a simple client for basic testing.
Garbage 1 day ago 1 reply      
I am surprised how many people didn't know about Postman.

It's an amazing tool. I have been using it almost daily for a long time now.

hdra 1 day ago 7 replies      
Would be perfect if it can stop being a chrome packaged app though. The window switching in OS X makes it really hard for me to get to it when I have a chrome window open at the same time.

I wish it could be wrapped in a native app or even just let it run in a chrome tab.

ninjakeyboard 1 hour ago 0 replies      
I've been using this for a while. I usually go to curl if I need to do any really heavy work but it's a handy quick tool.
namigop 1 day ago 0 replies      
{ Full disclosure, I'm the author of WCFStorm.Rest }

If you're on Windows and are looking for an alternative, check out WCFStorm.Rest (http://www.wcfstorm.com/wcf/learn-more-rest.aspx). It has a lot of features similar to Postman and adds some more, like saving requests and responses into functional test cases, which includes showing a graphical "diff" between the actual and expected responses, as well as being able to define custom validation rules that are executed against the HTTP response. It can also do a single load test as well as a distributed load test using several machines (http://www.wcfstorm.com/wcf/how-to--distributed-performance-...)

WcfStorm.Rest is paid software, but it has a LITE version which is available for free and works well for ad-hoc testing and exploring REST API endpoints.

tzury 1 day ago 0 replies      
Great tool, added to toolchain.

If you've never come across the Firefox Tamper Data add-on, this might be a great opportunity to mention it.


A different tool, yet targeting a similar audience (us).

lnanek2 1 day ago 0 replies      
I use Simple REST Client for Chrome. This might have a few extra features, so I guess I'll try it out. Seems like a common, over-served extension idea, though.
techaddict009 6 hours ago 0 replies      
I had tried many other similar tools, but all of them started crashing my Chrome. Postman didn't cause any issues with the browser's performance.

I will suggest Postman for sure.

Ind007 1 day ago 0 replies      
I find Advanced REST Client friendlier.


SixSigma 1 day ago 2 replies      
"no more fiddling with the command line"

If you have to fiddle with the command line, you're doing it wrong. It drives us nuts in the Plan 9 community that Bash history and readline are seen as some sort of productivity tool. We have powerful shell primitives, but the command line is seen as the last resort for composing them. Admittedly we have a terminal window in which you can edit text in two dimensions, but use a proper set of tools with a bit of forethought and you get much more done.

If you need an OAuth client, write one with a few bits of script and use it everywhere. It's the Unix way.

politician 1 day ago 0 replies      
Postman is an amazingly productive way to test or explore an API. It's a cheap win to export a request collection, and stick it in version control to capture the known state of an API at a point in time.
greenbee 1 day ago 1 reply      
I've been using Postman for a while. Overall, I'm very impressed, as it was intuitive straight from the first use. However, when putting files in a POST request, it doesn't remember the files for you. Also, there is no way to stop an ongoing request besides resetting and clearing all the parameters. I really hope they accommodate these options in the future.
ams6110 14 hours ago 0 replies      
chandraonline 1 day ago 0 replies      
POSTman is an awesome testing tool and also a great way to document APIs to share within a team. I wish it had support for storing information from responses as variables for use in later requests. I currently use globals within environments to set things that rarely change, like API keys and secrets, but that doesn't work for things that change on every invocation. For example, it would be nice if I could save the access token from an OAuth call in $token and then refer only to $token in other protected resources.
atonse 1 day ago 0 replies      
I love Postman but am not a fan of the slightly weird behavior of Chrome apps.
cl8ton 19 hours ago 0 replies      
Postman is exactly what I have been looking for. I used Fiddler for years and always thought it to be heavy and awkward to use. It's strange I never heard about Postman before this post.

I like working out of the browser and Postman has just the right amount of features I need.

jug6ernaut 1 day ago 5 replies      
I know I am missing something here... but how on earth do you launch this?
jroseattle 1 day ago 0 replies      
I use both Postman and REST Console alternately for testing our JSON API. Combined with the console open to the Network tab, it's indispensable.
andeh89 1 day ago 2 replies      
We've been using Postman for a while now; it's invaluable for testing/validating APIs across our environments. One feature we've been waiting a long time for, however, is live collection sharing: a centralized set of endpoints/etc. that's kept updated and synchronized (without needing everyone to export and import JSON files). There's a feature request on GitHub but it's been in stasis for a long time now.
nodesocket 1 day ago 0 replies      
Postman is an awesome tool. If you're interested in a PHP library to go a bit further and curl URLs and expect certain JSON response bodies, headers, and status codes, check out https://github.com/commando/dogpatch.
hasancc 1 day ago 0 replies      
I've been using it for quite a while now. I'm ashamed to say that its simple approach to collections often made me question the urgency of writing API documentation when developing a private API. Why not just send the collection of examples? Well, we all know why, but still...
alex440 3 hours ago 0 replies      
I tried Postman and found it rather weird. What did not suit me:

* Problems with remembering history - I'd expect the much-labored-over settings, vars, and options to be stored in the cloud and not vanish if I reinstall the extension or use another machine.

* No custom cookies - that's a deal breaker.

I prefer the excellent Fiddler tool for manual fiddling, and I have yet to find a good test-suite tool. Or I'll write one myself...

free 1 day ago 0 replies      
I switched to Postman because it supports attaching files to make multipart requests, which RestClient does not.
deanmoriarty 1 day ago 0 replies      
And let's not forget it's the only one I've seen that automatically remembers cookies. Very, very handy.
idlemind 15 hours ago 0 replies      
It's a great tool, we use it all the time on our team. It's the easiest way I've found to share a full demonstration of the API all in one go, with documentation. Couldn't live without it now!
lilpirate 1 day ago 2 replies      
Postman works great. The only feature I would like to see added is a way to easily swap the base URL in requests, so that I'd have to maintain just one collection per API rather than multiple collections for multiple API deployments (production, testing, staging).
uuid_to_string 1 day ago 2 replies      
Our current testing utilities are relatively small and compile with modest amounts of RAM. Hobbit's netcat, WR Stevens' sock, etc.

We have security auditing rules that require us to compile the programs we run from source. We prefer small programs as they are easier to audit.

Does anyone know how much RAM is required to compile Chromium?

Can it be done on a laptop?

domrdy 1 day ago 0 replies      
I've been using this the past few days for testing elasticsearch queries, great tool!
theboss 1 day ago 0 replies      
Super useful for testing web-apps for vulnerabilities
karangoeluw 1 day ago 0 replies      
I'll recommend Postman to anyone any day. I make a ton of APIs and it's the best tool for testing them.
ShaneCurran 1 day ago 0 replies      
I've been using Postman pretty much every day for the past couple of years. It's a really handy tool for testing out APIs in particular which is primarily what I use it for. It's definitely worth upgrading your package and I think the developer well deserves it.
themez 23 hours ago 0 replies      
What impresses me is how far they can go as a Chrome app; Chrome is turning into a platform like Adobe AIR now. It's amazing.
munimkazia 1 day ago 0 replies      
This tool is very popular in my place of work and geek circle. We've been using it for quite a while now.
rdvrk 1 day ago 0 replies      
Super useful. It's nice that you can set auth parameters (basic/digest/oauth1&2) and headers easily.

I use it as a packaged app, so launching it is the same as it is for native stuff.

sinzone 1 day ago 0 replies      
http://mashape.com is a proud supporter of Postman
vitalus 1 day ago 0 replies      
Our team's been using this for about 6 months as a regular part of our workflow. I now lean on it heavily while developing JSON APIs
pingburg 1 day ago 0 replies      
I've used Postman for a while and it is great for those trying to understand integrating with web apis. The ability to save requests in a namespace is invaluable.
reinier_s4g 23 hours ago 0 replies      
Postman is a great tool; I can't imagine developing applications nowadays without it.
threeseed 1 day ago 2 replies      
I would also take a look at SoapUI if you are dealing with multi-step flows:


vtempest 1 day ago 1 reply      
I've been using this for a while. I wish it had a feature for logging the HTTP requests currently in progress, similar to other add-ons or Wireshark.
jayjohnson 1 day ago 0 replies      
Great tool we use during development of our REST APIs. Liking the collections for testing regressions too.
shaneqful 1 day ago 0 replies      
Or you could just use curl.
digitalpacman 1 day ago 0 replies      
Been using postman forever. But it needs lots of work.
malkia 1 day ago 0 replies      
Postman turned out to be really useful for me while getting some Jira REST stuff to work.
shivaas 1 day ago 0 replies      
Love this tool!! Indispensable if you are working with a REST API, or even debugging SOAP responses.
victormx 1 day ago 0 replies      
The only flaw is that you can't see the size of a request without using a Chrome tab.
forlorn 1 day ago 1 reply      
> for Chrome

Thank you but no.

They say nothing will change medium.com
373 points by jonsuh  5 days ago   116 comments top 31
gnu8 4 days ago 10 replies      
Those who are defending this as a reasonable and commonplace policy are dissembling at best. This is another example of the emerging electronic class system. Those who are members of the Silicon Valley clique are privileged to take what they want from those who are not. Recall the Googler who was able to have a web page he didn't like shut down, just by calling his connection at Digital Ocean.

One might argue that a thing such as an Instagram account is just a service provided by a business and the business can do as it likes, but this isn't the case. A social media account is a vehicle for the user to interact with the entire world, and it shouldn't be able to be unilaterally revoked, especially if the only reason is to give it to a "more deserving" insider. We need a system of due process for situations like this.

josefresco 4 days ago 3 replies      
Before we all seemingly jump on the "she should have updated her account" bandwagon, it would help to see actual evidence of her inactivity, or at least evidence/statements in Instagram's TOS that state what "inactive" really means.

This doesn't cut it: "We encourage people to actively log in and use Instagram once they create an account."

How long can I go idle before Instagram takes my account back?

Also, as someone else stated in this thread the very least Instagram could have done was to email her and inform her that she was about to lose her account due to inactivity.

Lastly, it's important to note that Instagram didn't just "prune" her account, they renamed it and gave her original account name to an employee. If they were concerned with squatters or dormant accounts they would have actually nixed the account, not renamed it to something else.

simonsarris 4 days ago 5 replies      
Oh guffaw. Even our beloved Github has a means to let you unseat inactive account names.


(And they should, I think.)

Since at least 2012 Instagram has had this in their terms:

> 4. We reserve the right to force forfeiture of any username that becomes inactive, violates trademark, or may mislead other users.

So whine about the policy if you don't like it, but don't whine that Instagram has materially changed.

steven2012 4 days ago 0 replies      
It's clear that Facebook etc employees are higher on the virtual caste system than the rest of us peons. The more and more they make this evident, the less interested regular people will be in participating in their virtual world.
brandon272 4 days ago 0 replies      
I think it's in extremely bad taste to remove an account that's ever had content uploaded to it. Person registered two years ago and hasn't uploaded anything? Sure, prune the account. User creates an Instagram account and uploads some nice family photos and doesn't sign in for a year? Leave it alone!
drgath 4 days ago 1 reply      
As others have mentioned, this was probably due to inactivity. I stopped using Instagram years ago, but still had 'instagram.com/derek', and just checked to see if that was still my username. Nope, it is now 'derek______________', just like '_____kathleen'. The fact that a FB employee now owns the 'kathleen' user means they probably have an internal reservation system for expired accounts, which is a nice perk.
lazerwalker 4 days ago 3 replies      
Whoever you think is in the wrong here, the real takeaway is a reminder that when you use a VC-funded for-profit service, you don't "own" anything.
avree 4 days ago 4 replies      
This isn't actually an uncommon policy. For example, Twitter lets (or used to let?) you take a username that's inactive for 9 months.

Source: http://sarahwallace.wordpress.com/2010/09/23/how-to-request-...

The Facebook stuff is probably a red herring here. If there was activity on the account, I bet this would never have happened.

klenwell 4 days ago 1 reply      
Worried about having someone steal your invaluable Twitter or Instagram username?

The solution is obviously to immediately litter all your social media accounts with such foul loathsome toxic content that no one else would want to touch them again for at least the next 1000 years.

jonsuh 4 days ago 4 replies      
Taken straight from Instagram's policy:

> "We encourage people to actively log in and use Instagram once they create an account. To keep your account active, be sure to log in and share photos, as well as like and comment on photos. Accounts may be permanently removed due to prolonged inactivity, so please use your account once you sign up!"

Source: https://www.facebook.com/help/instagram/294919817276863

The wording is terribly obtuse and seems targeted more at username squatters.

Would be helpful if Instagram defined what a period of prolonged inactivity is. Shady nonetheless, considering they didn't even notify her informing her that her username was revoked due to inactivity.

tensafefrogs 4 days ago 0 replies      
"A few months ago while tagging my wife"

"This morning I told her I Instagrammed a photo of our kids that she should see."

Instagram names are not domain names, and it sounds like she doesn't use the account. Most services have a clause that lets them reclaim inactive accounts after a set period of time.

ampersandy 4 days ago 1 reply      
What does this have to do with Facebook and the acquisition? Anyone working at Instagram pre-acquisition would also have been able to reclaim inactive handles.

Twitter does this as well and that's how I got my current username. There are a couple of things that are required before you could reclaim a handle (I forget the exact timespans, but it was close to this):

    * Has not logged in for one year
    * Has not tweeted in a year and a half
    * Does not have any applications linked to Twitter
If Instagram has a similar policy, I really don't have any sympathy that the username was taken.

uptown 4 days ago 0 replies      
Update Apr 16, 2014 @2:34pm: Im very pleased to announce that Facebook / Instagram did the right thing and delivered my wifes Instagram handle back to its rightful place: http://instagram.com/kathleen
gordaco 4 days ago 0 replies      
This looks like an employee acting on her own, thinking (wrongly) that the account was not active and nobody would notice if she took over the username. Still, it's a disturbing issue that shouldn't have been allowed to happen, so if this is the case I hope the employee gets some penalty. And the fact that this was possible, or maybe even legal (I don't know Instagram's terms of service), doesn't make it less of a dick move.

I was an employee for a local social network with about ~10mil registered users (~5mil daily users). It was much smaller than Instagram, but despite that (or precisely because of that) things like this were completely forbidden.

raesene3 4 days ago 0 replies      
What I think is interesting about this is more the general case than this specific example. I'd say that people's social media handles are becoming more and more important to them, so loss of them becomes increasingly bad.

A lot of people in the tech world are probably more known by things like their twitter handle than their real name.

With free services (and indeed perhaps with paid for) there's not a lot to stop a company changing the ToS to allow for usernames to be transferred as they like(assuming it's not already in the ToS).

Now if your chosen handle is pretty niche (no one who's not a fan of 90's AD&D settings is likely to want mine), it's probably not a big risk, but for other ones it seems plausible to suggest that a company might start seeing them as a valuable asset, to be monetized.

owenversteeg 4 days ago 0 replies      
This happened to me (employee of the company took my username) with a different service, and a quick tweet to the founder had my account restored.

I'm not going to identify the website because the person that made it is a nice guy in general and he restored my account right after I asked.

dmschulman 4 days ago 0 replies      
This is the new norm now that these web services are no longer a niche product. Some kind of set standard for inactivity would be nice so users are aware of when they are in danger of losing their username.

I know many who set out to register popular Twitter and Soundcloud names when those services launched just so they could sit on them and possibly make a buck. Those username policies are out there to combat this kind of behavior but it's crummy to see when those policies actually affect legitimate users.

thehme 4 days ago 0 replies      
I read "..she opened up Instagram on her phone (she's not a regular on the service anymore)", and wondered, does he mean that @kathleen had not been using it for a while (months)? There is no justification for stealing a handle, but I was just wondering. I recall once I wanted a specific handle on Twitter for an idea I had, so I contacted the owner via a private message. He/she had no tweets, and to this day I have not heard back. Would anyone be open to perhaps having an "expiration" date on our accounts? Sometimes I feel like there are robots out there claiming every possible handle so that, idk, they can sell it later?
dublinben 4 days ago 1 reply      
You have no ownership over a username in a private service. Your access can (and will) be terminated at any time in accordance to their Terms of Service. If you want to maintain control over your identity and presence online, you ought to use self-hosted services like Pump.io or Diaspora.
xacaxulu 4 days ago 0 replies      
When can we start taking accounts of deceased persons?
rch 4 days ago 4 replies      
Why don't new services default to using a random string as an identifier, along with an alias for display, instead of a requiring a unique username?

Managing overlapping names among friends is something most people know how to do well enough already.
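A minimal sketch of the scheme rch describes (the names and field layout below are my own, not from any real service): the identifier is random and permanent, the display alias is mutable, so a rename or even an alias collision never invalidates a reference to the account.

```python
import secrets

def new_account(alias):
    # The identifier is random and opaque; it is the only stable key.
    return {"id": secrets.token_urlsafe(9), "alias": alias}

def rename(account, new_alias):
    # Only the display alias changes; anything holding the "id" still works.
    account["alias"] = new_alias
    return account
```

Two accounts could then both display "kathleen" without conflict, because nothing in the system is keyed on the alias.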

yankoff 4 days ago 0 replies      
This is just unbelievable.. Have you tried to email their support? It's hard to believe that that could be an acceptable practice at any company, I would assume it's just some employee being a jackass.
reshambabble 4 days ago 0 replies      
There are two interesting groups at play here - the tech companies that own all data, usernames, etc. on their platform and the user that needs to be on their platform for the company to exist and be successful. Our private information becomes public information when we share it with some of these companies, and they are given permission to own and use the information to a certain extent. They promise us security and stability at first because they need us, but when they don't depend on us anymore they can get away with sacrificing those users who don't contribute enough in order to serve their own interests. Because what does losing one person do to them? Are we all going to boycott Instagram now? Probably not. There definitely needs to be some regulation on how internet-based businesses can use and change a user's information because our public/internet identity has become so integrated into our lives that an incident like a sudden change in username can feel like a violation of privacy (even when it's not really one).
emsy 4 days ago 0 replies      
So what's next? Facebook employees breaking into houses and stealing Occulus Rift DevKits that haven't been used for a while? SCNR
centizen 4 days ago 0 replies      
Instagram has done this in the past as well, and, IIRC, before the merger. But by all means - jump on the Facebook hatewagon and take a ride.
qwerta 4 days ago 0 replies      
Isn't there some 'anti-hacking' law in the US? Aaron S. was threatened with something like 30 years for downloading a few documents.

Call the FBI and see what happens.

Im_Talking 4 days ago 0 replies      
I hate Facebook and, by association, I hate every Facebook subsidiary. They are monetizing your privacy.
logfromblammo 4 days ago 0 replies      
I feel as though this would be a good opportunity to remind people that choosing an Internet handle that has any connection to your public identity name is not necessarily a good idea. The potential name space for memorable, usable, easily-typed handles is much larger than the list in the baby names book, and there is value in avoiding collisions.

As my own public name is two of the most common first names and one of the most common last names in the Anglophone world, I am not altogether unfamiliar with the disutility in using a common name.

Aside from that, in the real world, we have a host of disambiguators available to tell the difference between two individuals with similar names. There is no particular reason why a site's user handle would need to be unique. The data store should probably be keying everything on a serial ID number anyway. Just as the DNS exists to associate names with IP numbers, a handle resolver could use disambiguators as needed to minimize disruption of the user experience due to non-uniqueness.

If you log in from a new device with a new IP address, you might be asked "Which 'kathleen' are you?" once, and get a small "I'm a different 'kathleen'" link thereafter. If you're a giant like Facebook, there is probably more utility in allowing people to have short, non-unique user handles with an on-demand disambiguation system than in a system required to enforce user handle uniqueness.
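As a toy illustration of that idea (my own sketch, not any real site's design): key every record on a serial ID, let handles collide, and have the resolve step return every match so the UI can disambiguate on demand.

```python
import itertools

class HandleDirectory:
    def __init__(self):
        self._ids = itertools.count(1)
        self._accounts = {}   # serial id -> record (the real key)
        self._handles = {}    # handle -> set of serial ids sharing it

    def register(self, handle, hint):
        uid = next(self._ids)
        self._accounts[uid] = {"id": uid, "handle": handle, "hint": hint}
        self._handles.setdefault(handle, set()).add(uid)
        return uid

    def resolve(self, handle):
        # Every account sharing the handle; the UI asks "Which 'kathleen'
        # are you?" only when more than one record comes back.
        return [self._accounts[u] for u in sorted(self._handles.get(handle, ()))]
```

Deleting or renaming a handle touches only the `_handles` index; references by serial ID are unaffected.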

geldedus 4 days ago 0 replies      
Glad I deleted my Instagram account back during the policy-change scandal; this incident confirms Instagram is a BS site
fbndki 4 days ago 0 replies      
Welcome to the nightmare
bichiliad 4 days ago 0 replies      
I wonder if this is just really, really clever PR to get people to use their accounts more.
Ask HN: What source code is worth studying?
369 points by SatyajitSarangi  4 days ago   166 comments top 74
sillysaurus3 4 days ago 11 replies      
== Vim or Emacs ==

Just pick one and force yourself to use it to the exclusion of other editors. Future you will thank you later, because you'll still be using it 20 years from now. "We are typists first, programmers second" comes to mind. You need to be able to move chunks of code around, substitute things with regexes, use marks, use editor macros, etc.

== 6.824: Distributed Systems ==

http://pdos.csail.mit.edu/6.824-2013/ Do each lab. Read the discussion and rtm's course notes.

== Tarsnap ==

https://www.tarsnap.com/download.html How to write C. Study the "meta," that is, the choice of how the codebase is structured and the ruthless attention to detail. Pay attention to how functions are commented, both in the body of the function and in the prototypes. Use doxygen to help you navigate the codebase. Bonus: that'll teach you how to use doxygen to navigate a codebase.

== xv6 ==




Read the book. Force yourself to read it in its entirety. Use the source code PDF to study how to turn theory into practice.

== Arc ==


You're not studying Arc to learn Arc. You're studying Arc to learn how to implement Arc. You'll learn the power of anaphoric macros. You'll learn the innards of Racket.

Questions to ask yourself: Why did Racket as a platform make it easier to implement Arc than, say, C/Golang/Ruby/Python? Now pick one of those and ask yourself: what would be required in order to implement Arc on that platform? For example, if you say "C," a partial answer would be "I'd have to write my own garbage collector," whereas for Golang or Lua that wouldn't be the case.

The enlightenment experience you want out of this self-study is realizing that it's very difficult to express the ideas embodied in the Arc codebase any more succinctly without sacrificing its power and flexibility.

Now implement the four 6.824 labs in Arc. No, I'm not kidding. I've done it. It won't take you very long at this point. You'll need to read the RPC section of Golang's standard library and understand how it works, then port those ideas to Arc. Don't worry about making it nice; just make it work. Port the lab's unit tests to Arc, then ensure your Arc version passes those tests. The performance is actually not too bad: the Arc version runs only a few times slower than the Golang version if I remember correctly.

== Matasano crypto challenges ==

http://www.matasano.com/articles/crypto-challenges/ Just trust me on this one. They're cool and fun and funny. If you've ever wanted to figure out how to steal encrypted song lyrics from the 70's, look no further.

== Misc ==

(This isn't programming, just useful or interesting.)

Statistics Done Wrong http://www.statisticsdonewrong.com/

A Mathematician's Apology http://www.math.ualberta.ca/mss/misc/A%20Mathematician's%20A...

Surely You're Joking, Mr. Feynman http://web.archive.org/web/20050830091901/http://www.gorgora...

Zen and the Art of Motorcycle Maintenance http://www.arvindguptatoys.com/arvindgupta/zen-motorcycle.pd...

== Above All ==

Don't fall in love with studying theory. Practice. Do what you want; do what interests you. Find new things that interest you. Push yourself. Do not identify yourself as "an X programmer," or as anything else. Don't get caught up in debates about what's better; instead explore what's possible.

stiff 4 days ago 1 reply      
I think you get more benefit from reading code if you study something very close to what you are working on yourself, something in the same domain, in the same framework perhaps, or at least in the same programming language, at best something you are deeply involved in currently.

I never seem to get enough motivation to read deeply into random "grand" code bases like Lua or SQLite, but some months ago I got into the habit of always studying a bunch of projects that use a given technology before I use that technology, and it greatly decreased the amount of time it takes me to get to an "idiomatic" coding style. So instead of diving in at random, I would recommend making the study of existing code-bases related to what you are currently doing an integral part of your workflow.

willvarfar 4 days ago 1 reply      
Fabien Sanglard http://fabiensanglard.net has some excellent code reviews on his website, particularly games.

You could read some of the code-bases he reviews, and then read his review. You'll be able to compare and contrast your opinions with his, and if there's interesting variation you can blog about it ;)

robin2 4 days ago 0 replies      
Slightly off topic, but Peter Seibel's take on the idea of code reading groups, and the idea of code as literature, is interesting: http://www.gigamonkeys.com/code-reading/

"Code is not literature and we are not readers. Rather, interesting pieces of code are specimens and we are naturalists. So instead of trying to pick out a piece of code and reading it and then discussing it like a bunch of Comp Lit. grad students, I think a better model is for one of us to play the role of a 19th century naturalist returning from a trip to some exotic island to present to the local scientific society a discussion of the crazy beetles they found."

The reason this is off topic is that it sounds like you were after interesting specimens anyway. I don't have any code examples as such, although if algorithms count I'm particularly fond of Tarjan's algorithm for finding strongly connected components in a directed graph, and the Burrows-Wheeler transform (as used in bzip).
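Since the Burrows-Wheeler transform is mentioned: the textbook construction is short enough to sketch here (a naive version with a NUL sentinel; bzip2 computes the same transform efficiently via suffix sorting).

```python
def bwt(s):
    # Naive O(n^2 log n) construction: append a sentinel that sorts before
    # everything else, sort all rotations of the string, keep the last
    # column. The result groups similar characters together, which is what
    # makes the downstream compression stages in bzip2 effective.
    s = s + "\0"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)
```

For example, `bwt("banana")` yields `"annb\0aa"`: a permutation of the input (plus the sentinel) with the repeated characters clustered.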

fotcorn 4 days ago 2 replies      
The Architecture of Open Source Applications book[0] gives a high level overview on many open source projects. It's a good starting point to dive into the code of these projects.

[0] http://aosabook.org/en/index.html

oneeyedpigeon 4 days ago 1 reply      
To mix things up a bit, I'm going to give two very small examples of code that can be understood quickly, but studied diligently. Both are in JavaScript, which I notice you mention specifically in another comment:

[1] Douglas Crockford's JSON parser. Worth a look because it is excellently commented and is easily understandable https://github.com/douglascrockford/JSON-js/blob/master/json...

[2] Bouncing Beholder. A game written in 1K of highly obfuscated code, which the author expands upon here. Worth it because it teaches some crazy optimisation techniques that are applicable to all programming, but also includes plenty of javascript-specific trickery. http://marijnhaverbeke.nl/js1k/

dailo10 4 days ago 1 reply      
Python Sudoku Solver by Peter Norvig -- an elegant solution in one page of code. When I read this, I felt like code is art.


davidw 4 days ago 0 replies      
I'm partial to the Tcl C code:


It's very nicely commented and has a nice, easy to read style throughout (except for the regexp files).

pcx 4 days ago 0 replies      
I've heard lots of people sing praises for Redis source - https://github.com/antirez/redis. A cursory look into the source shows a very well documented code-base. It's one of the top items in my to-read-some-day list. Salvatore is an excellent C programmer and takes a lot of pain in writing good documentation, despite his not so great English skills. A shout out for him, thanks for setting an example.
raverbashing 4 days ago 0 replies      
The Linux Kernel

Very clean (mostly) and heavily reviewed C code, following a strict coding convention

(Of course it's kernel code, so some things don't apply to userspace, still)

pavlov 4 days ago 0 replies      
I learned a lot from the Cocotron source:


It's a free cross-platform implementation of Apple's Cocoa, so there's a lot of stuff there. But the project is well organized, and almost everything is written in a minimalist oldschool Objective-C style.

I've looked at some other cross-platform frameworks, and they are often hard to understand because they have been developed by a large group of developers and include lots of complex optimizations and platform-specific code paths. Cocotron is not as finely tuned as Apple's CoreFoundation (for example), but much more readable.

biscarch 3 days ago 1 reply      
Erlang: Riak (https://github.com/basho/riak). Riak is actually a layering of a few different projects including Riak KV, Yokozuna (Solr), Riak Core, etc. It was grown out of the Dynamo paper.

Haskell: Snap (https://github.com/snapframework/snap). Snap is another project built in layers (snap-server, io-streams, snaplets, snap-core). The 1.0 release makes some pretty massive behind-the-scenes structural changes with minimal breakage of the public API, and io-streams is a very nice API to work with.

JavaScript: Underscore.js (http://underscorejs.org/docs/underscore.html). Underscore is a utility library that gives a nice overview of various techniques in JS, such as how to handle equality, use of apply, ternary operators, etc. Many functions have fallbacks to ECMAScript 5 native functions.

SixSigma 4 days ago 0 replies      
The plan9 operating system

* The lack of ifdefs makes cross-compiling a breeze

* It is easy to understand, compared to reading the Linux kernel


spacemanmatt 4 days ago 1 reply      
Please enjoy the source code of PostgreSQL (any version, but latest is generally recommended) core. It is very well factored, and typically also very well commented. This community cares a great deal about code quality, because they are so clear on the relation between readability, diagnosability, and execution correctness.
fit2rule 4 days ago 2 replies      
The sources to Lua are pretty darn great:


oscargrouch 3 days ago 0 replies      
My personal list (mostly imperative languages)

C++: (Complex software with elegance + performance)

  Dart source code
  V8 source code (same people as Dart)
  LevelDB
  Chrome (the only downside: too much virtual dispatch -> "javism")

  SQLite
  Redis
  Nginx
  Solaris and FreeBSD

  Rich Hickey's implementation of the Clojure runtime in Java
  (it was there in 2009.. maybe now this is in Clojure itself?)

  The Go standard libraries

lamby 4 days ago 0 replies      
"Beautiful Code" is worth a read-through, particularly for the commentary.

(One thing that I still remember years on is the "drop of sewage" example.)

olalonde 4 days ago 0 replies      
Javascript/Node.js: pretty much anything written by https://github.com/visionmedia (his less popular libraries are not very well commented, though) and https://github.com/jashkenas/underscore

Scheme (and functional programming in general): examples/exercises from the SICP book

AhtiK 4 days ago 2 replies      
Python => SQLAlchemy

Very clean, feature-rich yet pragmatic and well documented. https://github.com/zzzeek/sqlalchemy

projectileboy 4 days ago 1 reply      
I'd echo the advice to read the Arc source, and I'd add the various versions of Quake (C, C++). I learned a lot reading John Carmack's code.
rch 4 days ago 0 replies      
Take a look at Redis sometime. You might want to actually work on it a bit to help internalize what you're reading. Here are a couple of articles that might help get you started:



Locke1689 3 days ago 0 replies      
http://source.roslyn.codeplex.com/ for high performance, immutable C# code.

You'll see some differences from more relaxed C# projects (e.g., we avoid allocations like the plague), but I'd say we have pretty good style. ;)

agumonkey 4 days ago 0 replies      
I really enjoyed skimming through Ian Piumarta's Maru, a Lisp in C, very pretty code, very concise. (I already mentioned it in other topics)


DalekBaldwin 4 days ago 1 reply      
Honestly, aside from learning to express a few extremely specific patterns in your language of choice concisely and elegantly and reminding yourself of the existence of certain libraries and utility functions so you don't accidentally waste time reinventing them, I think reading source code is a pretty useless exercise unless you also have a detailed record of how that source code came to exist in its present form. Until there is some revolutionary new tool for generating a human-understandable narrated history of large-scale design decisions from a source control history, your time will almost certainly be better spent reading textbooks that incrementally develop a piece of software over several chapters. Even that is cheating -- the authors know exactly where they want to end up and they won't include all the missteps they made when they first started writing similar programs. But it's still loads better than the alternative. Just as sitting in a law school library absorbing an encyclopedic knowledge of the law won't really train you to make arguments that will fly in front of a judge, reading a code base as a dead, unchanging document won't teach you what it is to live in that code.
betterunix 4 days ago 0 replies      
SBCL or CMUCL -- Lisp compilers written in Lisp.
paulrademacher 3 days ago 1 reply      
Any suggestions for smaller codebases? A lot of these are great and you'll pick up idioms here and there, but they're massive.
twelvechairs 4 days ago 0 replies      
The most interesting things to read are those where a programmer has done something cleverly, but this only needs to happen when your language or libraries make it hard for you to begin with. Aside from low-level performance intensive functions, the best code is not interesting to read - it just reads like statements of fact.
agentultra 4 days ago 0 replies      
Anything you find interesting or find yourself using frequently.

A less glib answer: try Brogue: https://sites.google.com/site/broguegame/

A very interesting roguelike with interesting constraint-based features.

hiisi 4 days ago 0 replies      
C -> Redis

I haven't written any C for years, but really enjoyed skimming through Redis codebase, it's so clean, easily understandable and extensible.

villek 4 days ago 2 replies      
I found the annotated source code of the underscore.js to be very educational: http://underscorejs.org/docs/underscore.html
kjs3 4 days ago 0 replies      
I learned a huge amount about how real operating systems are put together and the compromises that get made by reading the V6 Unix source via John Lions' Commentary (yes...I had a photocopied copy). Made exploring the BSD 4.2 and 4.3 source trees (another worthwhile exercise) much easier. I suppose if I was starting out today and not in 1985 I'd look at xv6 or Minix.
nicholassmith 4 days ago 0 replies      
I had a read through the PCSX2 emulator recently, that was quite interesting: https://github.com/PCSX2/pcsx2 it's a complex project in what was surprisingly readable C++ code.
riffraff 4 days ago 0 replies      
Not a specific codebase, but I went through "Code Reading"[0] many years ago, I found it interesting. Most reviews are not very positive though, so maybe it was just at the right point for me.

[0] http://www.amazon.com/Code-Reading-Open-Source-Perspective/d...

laichzeit0 4 days ago 1 reply      
Eric S. Raymond wrote a book The Art of Unix Programming [1] that has many "case studies" as well as recommendations of which software/RFCs are particularly worthy of study.

[1] http://www.faqs.org/docs/artu/

jacquesm 4 days ago 5 replies      

  C -> Varnish
  PHP -> Yii
  Ruby -> Merb
  Scheme -> Arc
  Clojure -> Core
  JavaScript -> Multeor
Any languages in particular that you're interested in not covered above?

patrickg 4 days ago 0 replies      
I suggest the source code of TeX. Not new, but still very interesting to read.

source that needs some postprocessing (tangle/weave):


PDF from the source (including hyperlinks)


davedx 4 days ago 2 replies      
* BackboneJS

* UnderscoreJS

budu3 3 days ago 0 replies      
The old jQuery 1.6.2 code by John Resig is a good start for studying good JavaScript coding practices: http://robflaherty.github.io/jquery-annotated-source/
tlrobinson 3 days ago 2 replies      
Lots of great suggestions here, but I'm interested in how you go about reading source code, especially very large codebases?
collyw 4 days ago 1 reply      
Slight tangent to your question, but one thing I have noticed recently is that having to deal with really crap code inspires me to do my own better.

I inherited a colleagues work after she left, and it was horrible. But I thought about why it was horrible, and how to make it better. What would it look like if it was done well?

Even with my own code, if I look at something I did 6 months ago and it doesn't make sense straight away, then it can usually be improved.

pincubator 3 days ago 1 reply      
Also can someone suggest what is the best way to approach code reading? When I open a library in Python, I am not sure where to start reading, just a bunch of files. Should I randomly pick one file and start reading from there? Is there any common strategy?
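One concrete way in, sketched with only the standard library (`json` here is just a stand-in for whatever package you are reading): find the entry point, map the submodules, then jump straight to the source of a symbol you already use.

```python
import inspect
import pkgutil
import json  # any installed package works; json is only an example

# Start at the package's entry point: __init__.py defines the public surface.
entry = inspect.getsourcefile(json)

# Map the submodules to get a picture of the package before reading any
# single file in depth.
submodules = [m.name for m in pkgutil.iter_modules(json.__path__)]

# Then go straight to the source of a function you already call.
source = inspect.getsource(json.loads)
print(entry, submodules, source.splitlines()[0])
```

Reading a library this way, outward from the functions you actually use, tends to be less disorienting than picking files at random.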
redox_ 4 days ago 0 replies      
For all low-level I/O details (fflush/fsync/fdatasync on files/directories after creation/renaming), I used to read MySQL routines, pretty simple to understand: https://github.com/twitter/mysql/tree/31d6582606ddf4db17ad77...
entelect 4 days ago 0 replies      
rasur 3 days ago 0 replies      
Anything by Fabrice Bellard (Google him, it's worth it).
nextos 3 days ago 0 replies      
I think 2 suggestions by plinkplonk in the original thread would be still relevant:

Common Lisp - "Paradigms of Artificial Intelligence Programming" by Peter Norvig and "On Lisp" by Paul Graham

C - "C Interfaces and Implementations"

Minix 1 and XMonad are also very good suggestions too.

chris_wot 4 days ago 0 replies      
It's not great code (though I'm working to make it so), and perhaps not the intent of this question - but if you want to looking at a 25+ year old codebase that's being refactored, check out LibreOffice, especially the VCL component:


j_s 4 days ago 0 replies      
In the .NET world, shanselman has a series of Weekly Source Code blog posts and most recently posted a list of seven 'interesting books about source and source code'.


tuxguy 3 days ago 0 replies      
There is a good book on this theme: http://aosabook.org/en/index.html, where the authors actually go deep into the design & architecture of selected well-designed open source projects, e.g. llvm, git, freeswitch, etc.

Highly recommended !

lightyrs 3 days ago 0 replies      
I find anything by https://github.com/jashkenas to be transparent and enlightening.
rabino 3 days ago 0 replies      

To learn how to document code.

borntyping 4 days ago 0 replies      
Python: Flask (and related projects)
vishnugupta 4 days ago 0 replies      
I'm fascinated by concurrent programming. I find that reading classes from Java's java.util.concurrent package gives me very good practical insights as to what goes into building a concurrent class. My all time favorite is ConcurrentHashMap :)
diegoloop 4 days ago 0 replies      
I made this tool: http://codingstyleguide.com to improve the way I code for different languages and not get lost with too much programming information and it's helping me a lot.
twunde 4 days ago 1 reply      
For PHP, I've been very impressed by Phabricator's code (and the related phutils library). It's worth looking at the git commits as well to see just how clean and structured commits can be. I'm much more impressed by it than by any PHP framework code I've read (and I've read Zend, Symfony2, li3, codeigniter as well as custom frameworks)
maccard 3 days ago 0 replies      
I'm interested in Game Development, specifically physics simulation and graphics programming. The box2D code (C) is fantastic.
raju 4 days ago 1 reply      
Any suggestions for Clojure projects?

[Update: Oops. I missed the "Clojure -> Core" by jacquesm]

zengr 3 days ago 0 replies      
If you are into Java, ElasticSearch is very nicely written by Shay Banon.


qwerta 3 days ago 0 replies      
For Java I highly recommend H2 SQL DB. It has everything (parsers, sockets, webui...) in very tight and nice package.
snarfy 3 days ago 0 replies      
If you are interested in rendering engines I suggest Irrlicht. It's fairly clean and easy to understand.
Hydraulix989 4 days ago 1 reply      
C -> nginx
C++ -> Chrome
dfkf 4 days ago 0 replies      
db48x 4 days ago 0 replies      
TeX the Book is good, even if it is in Pascal.
dschiptsov 4 days ago 1 reply      
ddz 3 days ago 0 replies      
Find yourself a copy of this. Not only did it play a crucial role in the history of the UNIX/Linux world, it is a gold mine for understanding operating systems. http://en.wikipedia.org/wiki/Lions%27_Commentary_on_UNIX_6th...
s_dev 4 days ago 0 replies      
I've heard that reading the Git source code is very beneficial but haven't done it myself yet.
eadler 3 days ago 0 replies      
FreeBSD kernel & userland
willvarfar 4 days ago 1 reply      
(You say the 'naive' way; how can it be compressed better?)
visualR 4 days ago 0 replies      
DonHopkins 3 days ago 0 replies      
The original source code to Zork in MDL. It doesn't matter if you don't know MDL. It's such beautiful code that it just explains itself to you. And if you've played Zork, it's like being invited to explore the underground backstage areas of Disneyland.


DonHopkins 3 days ago 0 replies      
I highly recommend checking out http://voxeljs.com for some beautifully factored JavaScript npm packages, that implement a lot of Minecraft and more in the browser.

Max Ogden's talk (the first video on that page, also here: https://www.youtube.com/watch?v=8gM3xMObEz4 ) about how voxeljs and browserify work is inspirational, and his energy, motivation, deep understanding and skill, thirst for learning, reading other people's code, building on top of it, and sharing what he built and learned, is extremely contagious!

You may want to pause the video frequently and take notes -- there is so much great information in there, and he covers a hell of a lot of amazing stuff.

And the source code is really nicely broken up into lots of little npm modules that you can plug together to make all kinds of cool stuff.

This stuff is a great fun starting point for teenagers or students to learn how to program and create their own games and web applications, or master programmers to learn the node.js / npm ecosystem and idioms. There are some great ways for new and non-programmers to get into it.

He says "Everyday I work on it I get more motivated to work on it" -- and you will too!

What you will be benefitting from by watching his video and reading his code, is the fact that he actually did a survey of a HUGE amount of code, and took the best, read it, learned from it, rewrote it, and built on top of it.

"So many people have written voxel stuff, that I should just copy them." He used github search, and searched for minecraft, filtered by javascript, and went through ALL 23 PAGES of projects! He cloned ALL the repos he found, and read the ones that seemed promising, cloned them, got them running, understood how they worked.

A lot of them were the classic genius programmer projects, really impressive visually, super hard to understand, a giant lib folder with 50 files, everybody writing their own 3d engine.

Then he found out about three.js, and learned that, and combined all the stuff he had seen on top of it, including a PhD project in computational geometry that showed how to efficiently implement minecraft with three.js, for removing interior faces, etc.

So he learned from and built on top of all that great stuff, and made voxel.js and an insane amount of demos. Now the community has written a whole bunch of nice modular node.js npm modules and demos, that browserify can combine them together into a package that runs in the browser.

My only trivial beef with it is that their style guide says not to use trailing semicolons! That makes emacs very irritated and it breaks out in a rash.

But other than that, the code is very clean and modular and comprehensible, and opened my mind to a lot of stuff that I didn't realize was possible.

RhysU 3 days ago 0 replies      
marincounty 4 days ago 0 replies      
Get to know the command line before you start any language.
plicense 4 days ago 0 replies      
Everything at Google.
How Americans Die bloomberg.com
353 points by minimax  4 days ago   140 comments top 31
tokenadult 3 days ago 3 replies      
About three or four slides in you get the take-away message, which is often missed in discussions about mortality here on Hacker News: "If you divide the population into separate age cohorts, you can see that improvements in life expectancy have been broad-based and ongoing." And this is a finding that applies not only to the United States, but to the whole developed world. I have an eighty-one-year-old mother (born in the 1930s, of course) and a ninety-four-year-old aunt (born in the 1920s) and have other relatives who are quite old and still healthy. Life expectancy at age 40, at age 60, and at even higher ages is still rising throughout the developed countries of the world.[1] An article in a series on Slate, "Why Are You Not Dead Yet? Life expectancy doubled in past 150 years. Here's why."[2] explains what incremental improvements have led to better health and increased life expectancy at all ages in the United States. The very fascinating data visualizations in the article submitted today highlight the importance of research on preventing suicide, reducing drug abuse, and preventing senile dementia such as Alzheimer disease, which is where some of the next progress in prolonging healthy life will have to come from.

Professional demographers try to think ahead about these issues, not least so that national governments in various countries can project the funding necessary for publicly funded retirement income programs and national health insurance programs. Demographers have now been following the steady trends long enough to make projections that girls born since 2000 in the developed world are more likely than not to reach the age of 100,[3] with boys likely to enjoy lifespans almost as long. The article "The Biodemography of Human Ageing"[4] by James Vaupel, originally published in the journal Nature in 2010, is a good current reference on the subject. Vaupel is one of the leading scholars on the demography of aging and how to adjust for time trends in life expectancy. His striking finding is "Humans are living longer than ever before. In fact, newborn children in high-income countries can expect to live to more than 100 years. Starting in the mid-1800s, human longevity has increased dramatically and life expectancy is increasing by an average of six hours a day."
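Vaupel's "six hours a day" figure is easy to sanity-check: it works out to roughly three months of extra expectancy per calendar year, or 2.5 years per decade.

```python
hours_per_day = 6                                  # Vaupel's reported gain
extra_days_per_year = hours_per_day * 365 / 24     # ~91 days, about 3 months
extra_years_per_decade = extra_days_per_year * 10 / 365
print(extra_days_per_year, extra_years_per_decade)  # prints 91.25 2.5
```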

I was in a local Barnes and Noble bookstore back when I was shopping for an eightieth birthday gift (a book-holder) for my mom, and I discovered that the birthday card section in that store, which is mostly a bookstore, had multiple choices of cards for eightieth birthdays and even for ninetieth birthdays. We will be celebrating more and more and more birthdays of friends and relatives of advanced age in the coming decades.

[1] http://www.nature.com/scientificamerican/journal/v307/n3/box...

[2] http://www.slate.com/articles/health_and_science/science_of_...

[3] http://www.prb.org/Journalists/Webcasts/2010/humanlongevity....

[4] http://www.demographic-challenge.com/files/downloads/2eb51e2...

wtvanhest 4 days ago 1 reply      
The data is interesting, but somewhat difficult to draw conclusions from without considering how different rates are impacting other rates. What is really noteworthy here is the approach to showing the data. It's effortless to scroll through.

Here are some things I noticed after the fact:

1. I naturally wanted to finish the presentation and was compelled to click to see if there were any amazing insights.

2. After the fact, I have no idea how I even advanced the presentation; all I knew was that I clicked something. It was 100% natural.

It fully pulled me in. I can't remember if there were ads on the sides or more information.

[added] I went back and looked at it again, and I think what made it so flawless is that the first page gave me no option but to click the right-hand arrow, which taught me what to look for. I clicked the right arrow, and then I knew to click it again to advance. The progress dots on the top let me know that I didn't have much time left. Really amazing work here.

webwright 4 days ago 3 replies      
Ugh, the fact that many of these charts show raw # of deaths versus deaths/100k really masks how much things have improved. In 1968, the population was 64% of our current population... So a flat line is actually a pretty massive improvement.
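The normalization is easy to sanity-check with made-up numbers (a sketch; `per_100k` is just an illustrative helper, and the populations are rounded):

```python
def per_100k(deaths, population):
    """Convert a raw death count into a rate per 100,000 people."""
    return deaths * 100_000 / population

# Hypothetical cause of death with a perfectly flat raw count,
# while the population grows from ~200M (1968) to ~312.5M (the 64% ratio).
rate_1968 = per_100k(10_000, 200_000_000)   # 5.0 per 100k
rate_now = per_100k(10_000, 312_500_000)    # 3.2 per 100k, a ~36% drop
```

So a line that looks flat in raw counts actually hides roughly a one-third drop in the underlying death rate.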
minimax 4 days ago 0 replies      
If you liked this, you might enjoy some of their previous articles. It's interesting to see them iterating on the technique.

Consumer spending (from last December): http://www.bloomberg.com/dataview/2013-12-20/how-we-spend.ht...

Housing prices (from February): http://www.bloomberg.com/dataview/2014-02-25/bubble-to-bust-...

brudgers 3 days ago 4 replies      
"And, how do suicide and drugs compare to other violent deaths across the population? Far greater than firearm related deaths, and on the rise"

In 2010, 19,392 of the 38,364 suicides were "by discharge of firearm" [the same term used for classifying 11,078 homicides and 606 accidental deaths]. Seems a bit odd that the report classifies the accidents and homicides as "firearm related deaths" but the suicides as unrelated.

From a public health perspective, a 50% reduction in suicide by firearm would save more lives than the complete elimination of HIV deaths or cervical cancer deaths or uterine cancer deaths.


ef47d35620c1 3 days ago 2 replies      
I heard once that one cigarette a day as a stress relief may actually extend your life. I'm not sure about that, but I do think we need to be mentally and emotionally healthy too. Our health and well-being are not purely physical.

I would think that happy people who are not constantly under stress live longer.

mberning 4 days ago 6 replies      
Any info on how they create these visualizations? Are they using any particular libraries or frameworks?
imgabe 4 days ago 1 reply      
So in 1968 all age cohorts had the exact same mortality rate of 100 per 100,000? Why is that?
ABNWZ 4 days ago 4 replies      
"This is particularly striking since cancer and heart disease - the two biggest killers for 45-54 yr olds - have become much less deadly over the years"

Except your graph shows that cancer death rates have increased by almost 20% from 1968-2010... Am I missing something here?

richev 4 days ago 4 replies      
Very nice graphs and visualisations, but am I alone in finding most of them hard to understand?
rpedela 4 days ago 4 replies      
The part about suicides is pretty interesting and perplexing. Are there any insights into why the rate has increased?
Pxtl 4 days ago 0 replies      
Maybe we should have a war on drugs, then. I'm sure that would work.

Getting guns out of our communities is probably easier than getting drugs out of them, not to mention mental conditions that lead to suicide.

kafkaesque 3 days ago 1 reply      
I got the presentation's/graph's main takeaway, but did anyone else notice that women's mortality rate has hardly changed since 1968? Why is this, I wonder? Is this a population thing, or because women were mostly kept inside doing safer house duties, or what?
drinkzima 3 days ago 0 replies      
Pretty incredible user experience on mobile; I haven't seen graphs that look that good in a mobile web browser (and interactive, no less).
infosample 4 days ago 4 replies      
Black males die at such a higher rate from AIDS. Are they having that much more unprotected sex, taking that many more drugs from dirty needles, or getting that much inferior treatment than the general population?
cheetahtech 4 days ago 3 replies      
It's interesting to see that drugs and suicide are the highest causes of death, well above that of guns. But we seem to be progressing more towards a drug-open world and a gun-closed world. Do you see the irony?
dclowd9901 3 days ago 0 replies      
If whoever contributed to the code on this is around, could you give us some insight into building this app, or do a writeup? I'd be super interested to see how you designed/architected such a smooth experience.
dmritard96 4 days ago 0 replies      
"progress stopped in the mid 1990s" Maybe I am missing something, but it seems like the mortality rate would be a lagging indicator of progress, hence progress would have "stopped" earlier?

Not that I necessarily would say it stopped at all...

bittercynic 3 days ago 1 reply      
I couldn't figure out any way to navigate without using the mouse.
matthewisabel 4 days ago 1 reply      
I created a visualization on a similar topic that looked at mortality rates state-by-state using the 2010 census data. It was on HN about six months ago.


0003 4 days ago 1 reply      
Any reason why the 75-84 group was outliving the 85+ group until recently?
jmnicolas 3 days ago 0 replies      
I don't care much about the topic, but I thought this is a great way to present data!
RobotCaleb 4 days ago 0 replies      
That's neat, but it's very hard to tell the colors apart.
fophillips 4 days ago 0 replies      
Need some error bars on that data.
devanti 4 days ago 0 replies      
Surprised how nice the visualization looks, given how ugly the Bloomberg terminal is
brokenrhino 4 days ago 0 replies      
I wonder, is the drop in car accident deaths caused by:

1) Cash for Clunkers taking old, dangerous cars off the road, so the fleet consists of newer, safer cars, or

2) People driving less since the recession and the gas price increases?
jon_black 3 days ago 1 reply      
Everyone knows that most Americans actually die in terrorist attacks. How else can you justify such emphasis on fighting it? Hmmmmmmmm.
dragontamer 4 days ago 0 replies      
<script src="global/js/jquery-1.8.3-min.js" charset="utf-8"></script>

<script src="js/modernizer.2.7.1.js" charset="utf-8"></script>

<script src="js/underscore.1.5.2.js" charset="utf-8"></script>

<script src="global/js/less.js" charset="utf-8"></script>

<script src="global/js/d3.v2.js" charset="utf-8"></script>

<script src="js/jquery.cycle.all.js" charset="utf-8"></script>

It looks like the majority of this visualization was built with the D3.js library. I've been seeing more and more web documents of this style; it must be because of the rise of D3.

EGreg 3 days ago 0 replies      
"That's why total deaths in the 75+ category has stayed constant"

I thought that was a particularly funny statement. Reminded me of The Onion: http://www.theonion.com/articles/world-death-rate-holding-st...

joshuak 4 days ago 4 replies      
So to achieve longevity escape velocity [0]

1. Don't have unprotected sex if you're less than 44 years old.

2. Don't kill yourself, or do drugs, if you're less than 54 years old.

3. Invest heavily in heart disease, cancer, and Alzheimer's research.

[0] http://en.wikipedia.org/wiki/Longevity_escape_velocity

ihodes 4 days ago 5 replies      
Probably the four most important things you can do to change your odds of making it past 80 are:

    1. Not smoking.
    2. Eating healthily (fiber, vitamins, low sugar; this is a nascent field).
    3. Exercising regularly.
    4. Wearing sunscreen and minimizing sun exposure.
These will collectively reduce your risk of common cancers significantly, as well as protect against heart disease. Additionally, they can help strengthen your immune system and body against other diseases that e.g. the malnourished or obese would be more likely to succumb to.

SpaceX Dragon Successfully Docked With The Space Station forbes.com
352 points by lelf  19 hours ago   62 comments top 10
throwawaymsft 14 hours ago 2 replies      
Congrats to the SpaceX team, always inspiring.

The correction made in the article (5000 tons => 5000 lbs) shows the importance of having mental visualizations for basic numerical concepts.

5000 lbs of cargo is about an SUV. I can imagine a rocket carrying that up.

5000 tons of cargo is... a giant Wal-mart parking lot full of SUVs (2000 of them). Stacked, they'd make a tower of metal a few miles high. (How? Figure 6 feet tall x 2000 = 12000+ feet.)

Can you imagine a rocket carrying that up and dropping it off at the space station? Nope.

When numbers are just symbols, mistakes like this are easy to make.
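For anyone who wants to redo the back-of-the-envelope math, here is the same estimate as a sketch (the ~2.5 tons per SUV and 6 ft of stacked height are my own rough assumptions):

```python
# Rough sanity check of the tons-vs-pounds mixup.
TONS_PER_SUV = 2.5     # assumed weight of one SUV, in short tons
FT_PER_SUV = 6         # assumed height of one SUV when stacked
FT_PER_MILE = 5280

cargo_lb = 5000        # the corrected figure, in pounds
cargo_tons = 5000      # the original typo, in tons

suvs_lb = cargo_lb / (TONS_PER_SUV * 2000)   # 5000 lb ~= one SUV
suvs_tons = cargo_tons / TONS_PER_SUV        # 5000 tons ~= 2000 SUVs
stack_ft = suvs_tons * FT_PER_SUV            # 12,000 ft stacked
stack_miles = stack_ft / FT_PER_MILE         # ~2.3 miles high
```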

modeless 19 hours ago 5 replies      
Any more news on the performance of the first stage recovery test? Elon Musk tweeted that it was successful but I haven't heard anything more detailed than that.
rwitoff 19 hours ago 0 replies      
After 3 scrubs most of our non-critical colleagues left the cape, but the view of our payloads in the trunk after 2nd stage sep made it all worthwhile. It's amazing what SpaceX has been able to accomplish and reinvent in a business this risk-averse and I can't wait to see what's next.
Spittie 19 hours ago 1 reply      
SpaceX/Nasa livestreamed the whole docking (sadly I was asleep at the time...).

I don't know if the whole footage is up yet, but this is a nice summary: https://www.youtube.com/watch?v=3fDzvdEfSgc

bane 18 hours ago 1 reply      
You know, thinking over https://news.ycombinator.com/item?id=7617720 I almost wonder why Musk hasn't set an interim step to Mars that involves space habitats. Right now SpaceX is probably the closest to making an economy based on


and http://www.planetaryresources.com/

working. If they get the 1st stage reusable it brings the bootstrap costs down significantly.

vermontdevil 13 hours ago 0 replies      
They also tested the new replacement to grasshopper called Falcon 9R Dev


pearjuice 17 hours ago 5 replies      
I am skeptical of SpaceX's future endeavours. They proved they are on par with government-subsidized space programmes, but haven't done anything astonishing yet. I mean, they probably bought a majority of the tech from NASA or the Soviets and just made it work. Not that it isn't impressive, just that an actual in-house rocket would have been more groundbreaking.
jokoon 8 hours ago 1 reply      
Any video of the landing of the 1st stage, which is not a test?

Or is SpaceX still using expendable stages?

iotakodali 19 hours ago 0 replies      
An awesome start for reusable rockets, amazing!!
frade33 19 hours ago 2 replies      
Just do not watch the movie Gravity tonight. On a serious note, it's friggin' amazing and makes me feel proud of our race every single day.
Google's Street View computer vision can beat reCAPTCHA with 99% accuracy googleonlinesecurity.blogspot.com
321 points by apawloski  4 days ago   146 comments top 45
zwegner 4 days ago 13 replies      
This particular issue (AI performance on captchas) is really quite fascinating. It's an arms race, but the problem is, only one side can win. Google is claiming they have improved their system in some (understandably) unspecified way, but there's only so far this can go. Captchas need to detect whether someone is human, but they have to work for everyone, ideally, even those with disabilities. Any simple task a human can do will eventually be able to be automated. Tasks that aren't currently feasible to automate, say some natural language processing tasks, have another problem: scalability. To prevent simple databases of problems -> solutions, the problems need to be generated en masse, and for cheap, which means a computer needs to generate the solutions in the first place. And of course, paying people to just do captchas all day already happens.

The street address/book scan approach that Google uses is interesting in that the exact solution is not known, so they presumably have to be somewhat forgiving in accepting answers (as their machine learning might have gotten it wrong). Perhaps this is what their "risk analysis" refers to--whether their response seems "human" enough according to their data, not necessarily whether it's correct.

I don't see a way around this problem for free services that still preserves privacy (so directly using some government-issued ID is off the table). Maybe some Persona-like digital signature system, where a person can go to a physical location with a government ID, and get a signature that says "Trusted authority X affirms that person Y is in fact a person". Obviously this still has problems, as you need to trust X, and it's a big pain in the ass.

There are parallels to the realm of passwords, which are also becoming obsolete (not that there's a good replacement...). Anything that a human can feasibly remember for a bunch of sites is becoming easier and easier for computers to guess.

So basically, computers are taking over the world, and we can't do anything to stop it. God help us all.

josho 4 days ago 3 replies      
Interestingly I activated a new gmail account today and during the signup process I experienced the obligatory captcha. It was in two parts, the first looked strikingly like a street view picture of a house number, while the second looked like a traditional captcha.

I suspect that Google has been using techniques like this to validate their computer vision conclusions. Which makes their 99% assertion even more interesting, because it's likely 99% confirmed by a very large crowd-sourced data set, not simply a staff member going through several hundred samples to come up with the success rate.

jrochkind1 4 days ago 2 replies      
From that caption "CAPTCHA images correctly solved by the algorithm", there are at least two of them that I'm not sure _I_ can correctly solve on the first try.

Which is generally my experience with captchas these days; I only have about a 50% success rate.

CAPTCHA is a failed strategy, time to give it up.

adyus 4 days ago 4 replies      
In effect, Google computer vision got so good that they made their own system obsolete. This is a good thing.

I still think the only reliable way to confirm identity (or humanity) online is an email or SMS verification. Recently, receiving a 2-factor SMS code took less time than the page refresh prompting me to enter it.

zobzu 4 days ago 1 reply      
The program solves captchas that I, as a human, cannot solve. Pretty sure that means captchas of that type are definitely dead.
frik 4 days ago 0 replies      
Google's reCAPTCHA showed street numbers as one of the two captcha "words" for more than two years.

For me this was quite annoying, to input the street numbers of others. It's a privacy issue; it was like helping the NSA spy, and one feels bad entering Google's captchas.

What is even more astounding is that Google does not even mention all the crowd-sourced "volunteers" that trained their OCR database. As Google uses open OCR software (the former HP OCR app from '95), it would be a good choice to publish their data back to the community.

I removed the Google captcha on my own sites and implemented my own traditional captcha (at the first sight of these about two years ago).

jere 4 days ago 3 replies      
>In this paper, we show that this system is able to accurately detect and read difficult numbers in Street View with 90% accuracy.

> Turns out that this new algorithm can also be used to read CAPTCHA puzzleswe found that it can decipher the hardest distorted text puzzles from reCAPTCHA with over 99% accuracy.

Am I missing something or could we improve CAPTCHAs by mimicking street numbers?

dnlbyl 4 days ago 3 replies      
99% is probably better than my success rate with reCAPTCHA...
ilitirit 4 days ago 0 replies      
To be honest, I can't even solve those reCAPTCHAs on that page (that's one of my biggest gripes about reCAPTCHA). I think we're nearing a point in time where if some(thing) can solve a particularly hard CAPTCHA, we can safely assume that it's not human.
pacofvf 4 days ago 0 replies      
Well, there are a lot of CAPTCHA-solving-as-a-service sites like http://www.9kw.eu/
dlsym 4 days ago 0 replies      
"CAPTCHA images correctly solved by the algorithm" - Ok. Now I have to consider the possibility of being a machine.
msvan 4 days ago 1 reply      
Here's a captcha idea: make people write a 100-word essay on a specific topic. If it's good, you're accepted and you won't have to do it again. If it's bad, you're either a computer or cheap Nigerian labor. When we get to the point where we can't distinguish a computer from a human, we'll just let them be a part of the community.
zatkin 4 days ago 1 reply      
But can it beat CRAPCHA? http://crapcha.com/
rasz_pl 4 days ago 0 replies      
Does Google aggregate & correlate data in the vision algo?

For example, for street numbers they not only have a picture of a number, they also have knowledge of all the other numbers on that street and guesses for those other numbers. Easy to guesstimate the order of a number by checking neighbouring ones.

Same for book words: they have the n-gram database: http://storage.googleapis.com/books/ngrams/books/datasetsv2....

That's a lot of useful MAP/ML data.

But the examples they give for the new captchas all look like random crap, "mhhfereeeem" and the like. It's like they are not interested in structure, just the pure geometry of letters/numbers.

drawkbox 4 days ago 0 replies      
99% is better than most humans' captcha accuracy. Back in my day humans could still beat computers at chess, but nowadays computers can beat humans at Jeopardy and drive. Interesting to see when it fully crosses over.
aaronbrethorst 4 days ago 0 replies      
I'm impressed that their address identification algorithm can solve those CAPTCHAs. I can't make heads or tails of them.
shultays 4 days ago 0 replies      
My accuracy is way below 99%, good job Google!

Seriously though, I hope this does not mean there will be harder captchas, current ones are already stupidly hard

mrsaint 4 days ago 2 replies      
Captchas were meant to keep spammers at bay. Unfortunately, that's no longer the case. Thanks to "cloud technology" like DeathByCaptcha - that is, people in countries where labor is cheap solving captchas all day - spammers have no problem getting through reCaptcha-protected sites and forums to do their mischief.

As a result, reCaptcha & co tend to be more of an annoyance to honest visitors than to spammers.

spullara 4 days ago 0 replies      
Reminds me of a hack day at Yahoo where one team made a captcha where you had to match a photo with its tags, and another team made an algorithm that would assign tags to a photo. Both being based on Flickr meant that the captcha was easily solvable by the algorithm.
infinity0 4 days ago 0 replies      
Ironic how the HTTPS version force-redirects you to HTTP. (Amazon.co.uk started doing this a few days before and it's pissing me off no end.)
spullara 4 days ago 0 replies      
So, now if you get the captcha right you're a computer, otherwise you are a human?
pestaa 3 days ago 0 replies      
Folks, I figured it out! Let's use captcha so that visitors can prove they are robots! If you fail these captchas, you must certainly be a human!
aviraldg 4 days ago 0 replies      
Isn't this expected (and a natural consequence of the fact that it's trained on huge volumes of reCAPTCHA data?)
daffodil2 4 days ago 1 reply      
Wait, it's not clear to me from the blog post. Did they make a system that obsoletes reCAPTCHA? If so, it's just a matter of time before the spam systems catch up, correct? If so, what's the successor to CAPTCHA? Or is the web just going to be full of spam in the future?
plg 4 days ago 1 reply      
Why isn't google releasing the full algorithm?
varunrau 4 days ago 0 replies      
I've always felt that it would be only a matter of time before computer vision would be able to solve the (re)CAPTCHA problem. Especially since digit classifiers are able to match the performance of humans.

One approach that I enjoyed seeing was the use of reverse captchas. Here you pose a problem that a computer can easily solve, but a human cannot. For instance, you ask a simple question (1+1=?) but place the question box off the screen so the user can't see it. A computer would be able to easily answer the question, but a human user would have no way of doing so.

rasz_pl 4 days ago 0 replies      
>CAPTCHA images correctly solved by the algorithm

Well, isn't that great? Because I, a HUMAN, can maybe solve _one_ of those (the lower right one).

I frickin HATE google Captchas and simply close the page if it wants me to solve one, they are too hard for me.

pavelrub 4 days ago 0 replies      
This is essentially the technology that was discussed here 3 months ago [1], and it links to the exact same article on arxiv, titled: "Multi-digit Number Recognition from Street View Imagery using Deep Convolutional Neural Networks". [2]

The new addition to the article is that now they have tested the same type of NN on reCAPTCHA, and (perhaps unsurprisingly) it works.

[1] - https://news.ycombinator.com/item?id=7015602

[2] - http://arxiv.org/abs/1312.6082v4

aljungberg 4 days ago 0 replies      
Google software could use their 99% successful algorithm to filter potential captchas. Then show the 1% they can't crack to humans.

Now the race becomes who can write the better captcha solver, Google or the spammers? As spammers learn to identify things in the 1%, Google will hopefully improve faster and continue to narrow the "hard to solve" band.
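A minimal sketch of that triage loop (the `solver` interface is hypothetical; nothing here is Google's actual pipeline):

```python
def triage(captchas, solver, threshold=0.99):
    """Split captchas into those the model cracks confidently
    (retire or regenerate them) and the residue worth showing
    to humans as live challenges."""
    easy, hard = [], []
    for captcha in captchas:
        _answer, confidence = solver(captcha)
        (easy if confidence >= threshold else hard).append(captcha)
    return easy, hard
```

Here `solver` is assumed to return an (answer, confidence) pair; the `hard` bucket is the 1% you would actually serve to users.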

tsenkov 4 days ago 0 replies      
It's fascinating how the captcha, arguably simple software now, will inevitably become more and more complex as AI develops.
northisup 4 days ago 0 replies      
Yet it says I'm a robot a good two of three times.
stuaxo 4 days ago 0 replies      
I'm sorry, but as a human I have had to fill in these Street View-style captchas all the time for Google, so this is hardly a completely artificial intelligence. Humans have done it many, many times; in fact, I'm sure some of the pictures in the article have come up.
leccine 4 days ago 0 replies      
blueskin_ 4 days ago 0 replies      
Great... now they are going to get even harder to actually do.
exadeci 4 days ago 0 replies      
You're welcome, Google (we are the lab rats that taught their system how to read)
peterbotond 4 days ago 0 replies      
What if someone has bad eyes or some rare eye problem and cannot solve captchas at all? In other words, fails captchas 90% of the time.
EGreg 4 days ago 0 replies      
Basically consider why we want to eliminate computers from accessing the sites -- because we want to make account creation expensive, to prevent sybil attacks and giving away scarce resources.

What is expensive? Reputation. That's where credit money's value comes from.

I wrote a more comprehensive piece here, in case anyone's interested: https://news.ycombinator.com/item?id=7601690

vfclists 4 days ago 0 replies      
Google are getting too creepy for any sensible person's liking. Addresses which are off the street in apartment complexes are now getting recognized as well.

Whenever I see these kind of captchas I switch to audio captchas. It is rather unethical for Google to use recaptchas in this way.

Keyframe 4 days ago 2 replies      
Now that programs are better and better at solving CAPTCHAs, correct CAPTCHA input will mean the opposite of what it means now. Since programs are better at solving CAPTCHAs than humans, correct input (3/3 for example) will mean it's a robot. Thus, CAPTCHA becomes relevant again.
knodi 4 days ago 1 reply      
I just came here to say fuck reCAPTCHA! I hate it, I can't read it with my human eyes.
sajithdilshan 4 days ago 0 replies      
conectorx 4 days ago 0 replies      
This can also be done with the Tesseract or Encog frameworks... I don't know what's new about this.
spcoll 4 days ago 0 replies      
It's a new success for Deep Learning. It seems to be actually 99.8% accuracy according to their paper: http://arxiv.org/abs/1312.6082

That's a fivefold reduction in the error rate (from 1% down to 0.2%).

maccard 4 days ago 0 replies      
Damn, that's better than me!
techaddict009 4 days ago 0 replies      
This is really great. AI is getting smarter and smarter day by day!
Another Big Milestone for Servo: Acid2 mozilla.org
314 points by dherman  3 days ago   96 comments top 10
brson 3 days ago 2 replies      
Servo is the kind of project that launches a thousand research papers. Some of the early results are staggering and the project is still just getting going. It is a great example of doing serious research to practical ends.

Some examples:

- firstly, the entire foundation, Rust, is itself an ambitious research project that solves many long-standing problems in the domain.

- Servo has, or has plans for, parallelism (combinations of task-, data-parallelism, SIMD, GPU) at every level of the stack.

- The entirety of CSS layout (one of the most difficult and important parts of the stack) is already parallelized, and it's fast.

- It puts all DOM objects in the JS heap, eliminating the nightmarish cross-heap reference counting that historically plagues browser architectures (this is part of Blink's "oilpan" architecture).

modeless 3 days ago 4 replies      
I want Rust scripting support in Servo.

  <script type="text/x-rust" src="foo.rs">
Since Rust is a safe language this should be possible without compromising security, though I don't think anyone's yet attempted to write a JIT compiler for Rust. Has the Servo team considered this as a possibility?

ChuckMcM 3 days ago 3 replies      
This is awesome. I wonder if there is a more constrained web rendering engine somewhere. Something where, rather than 'render everything we've ever seen', the goal is 'render the following HTML standards correctly' (or at least predictably). I was looking for something like this for a modern-day sort of serial terminal thing.
bithush 3 days ago 1 reply      
With the bad press Mozilla has had the past few weeks it is easy for people to forget about some of the awesome things Mozilla are working on such as Rust and Servo. I really like the look of Rust and feel it might be the future native language for high performance applications. It is very exciting!
talklittle 3 days ago 1 reply      
schmrz 3 days ago 4 replies      
> Many kinds of browser security bugs, such as the recent Heartbleed vulnerability, are prevented automatically by the Rust compiler.

Does anyone care to explain how this would work? If you used OpenSSL from Rust you would still be vulnerable to Heartbleed. Or am I missing something?

macinjosh 3 days ago 1 reply      
This is what I see when I run Acid2 in Servo. Perhaps they haven't merged the changes into the public repo yet.


acqq 3 days ago 2 replies      
Is Servo using GC?
camus2 3 days ago 5 replies      
When can I expect Servo to be in Firefox instead of the current engine? 2015/2016? Do you have a rough idea?
sgarlatm 3 days ago 1 reply      
I'm curious what Chrome's plans are for the future, in particular related to parallelization. Has anyone seen any articles about that anywhere?
An Update on HN Comments
312 points by sama  3 days ago   261 comments top 43
bravura 3 days ago 7 replies      
I appreciate the changes. But while we're on the topic, could I throw out a thought?

It should be easier for a late-arriver on a post to add a useful comment, and have it be promoted. Have you considered using randomization to adjust the score of certain comments?

HN comments seem to exhibit a rich-get-richer phenomenon. One early comment that is highly rated can dominate the top of the thread. (I will note that, qualitatively, this doesn't seem as bad as a few months ago.)

The problem with this dynamic is that late commenters are less likely to be able to meaningfully contribute to a discussion, because their comment is likely to be buried.

One thing interesting about the way FB feed appears to work is that they use randomization to test the signal strength of new posts.

Have you considered using randomization in where to display a comment? By adding variation, you should be able to capture more information from voters about the proper eventual location for a comment. It also means more variation is presented to people who are monitoring a post's comments.
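A toy version of the idea (my own sketch, not anything HN actually runs): rank by a decaying vote score, then add noise so a late comment occasionally gets shown near the top and can collect votes.

```python
import random

def display_score(votes, age_hours, sigma=0.3):
    """Base score decays with age (a gravity-style ranking);
    Gaussian noise adds exploration, so newer comments are
    sometimes surfaced and can gather vote signal."""
    base = votes / (age_hours + 2) ** 1.5
    return base + random.gauss(0, sigma)

# Sorting a thread by display_score(...) on each page load gives
# every comment an occasional chance to be tested in a higher slot.
```

The exponent and noise width are arbitrary knobs; tune sigma down to zero and you recover a deterministic ranking.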

alain94040 3 days ago 10 replies      
I'd love to able to fold a nested conversation once I think that particular branch is going nowhere. HN should treat the folding as a signal similar to a down vote on that particular sub-thread. I often don't think any particular comment warrants a down vote, so I have no way to tell HN that the thread should be pushed back.

Plus everyone has been asking for a way to collapse sub-comments (and many plugins do it already).

jseliger 3 days ago 5 replies      
dang and kogir tuned the algorithms to make some downvotes more powerful. We've been monitoring the effects of this change, and it appears to be reducing toxic comments.

That's interesting to me because I find myself downvoting much more often than I used to. But the comments I downvote are not that often toxic in the sense of being nasty. They're more often low-content or low-value comments that don't add to the conversation.

The jerks and trolls are out there, but I'm not positive they're the most pernicious problem.

rdl 3 days ago 6 replies      
I wish there were multiple kinds of downvotes. "This is actually bad" (spam, etc.) vs. merely useless, vs. factually incorrect but reasonably presented.

I mostly only downvote spam or abuse; I try to ignore "no-op" comments, and would rather reply to someone with information about why they might be wrong vs. downvote, but I'm not sure if this is universal.

codegeek 3 days ago 3 replies      
"make some downvotes more powerful."

Yes, this will be great. Any comment that has personal attacks, abusive language, racial slurs, trolling, off-topic self-promotion/marketing, etc. should allow downvotes to be more powerful. Usually, comments like these get a lot of downvotes pretty quickly, but I am sure there are a few who upvote those comments as well for their own reasons.

Maybe comments like those should not be allowed upvotes once they reach a certain number of downvotes? Also, not sure if you guys already do this, but really bad comments should be killed automatically once downvoted a certain number of times within a short time span?

Now, when it comes to unpopular comments which are not necessarily outright bad, I am sure those are tough to program because how do you handle the sudden upvotes and downvotes at the same time ?

minimaxir 3 days ago 9 replies      
While on the subject of HN comments, I have a request: could the "avg" score for a user be readdressed?

The avg score is the average number of points over the previous X comments a user has made. However, this disincentivizes users from posting in new threads, which are unlikely to receive upvotes. I've lessened my own commenting in new threads because of this.
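To make the disincentive concrete, here is a minimal sketch of a trailing average like the one described. The window size and the score values are illustrative assumptions, not HN's actual implementation:

```python
# Hypothetical sketch of a trailing "avg" score. The window size (last 20
# comments) and the example scores are made up for illustration; this is
# not HN's real algorithm.

def trailing_avg(scores, window=20):
    """Average points over the user's most recent `window` comments."""
    recent = scores[-window:]
    return sum(recent) / len(recent)

established = [5, 8, 12, 6, 9]   # comments in busy, established threads
new_threads = [1, 1, 0, 1]       # comments in brand-new, low-traffic threads

print(trailing_avg(established))                # 8.0
print(trailing_avg(established + new_threads))  # noticeably lower
```

Because every new-thread comment enters the same window as the high-traffic ones, a user who comments early and often in new threads sees their displayed average sink, which is exactly the disincentive being described.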

stormbrew 3 days ago 1 reply      
Something that I've been finding lately is that replies to my posts have been downvoted when to me they're fairly reasonable disagreements with what I said. I've actually taken to upvoting replies to me that go grey a lot of the time, even though I don't particularly agree with what they're saying.

To me it seems like a lot more stuff is getting downvoted than used to, and I'm not sure I see a meaningful pattern in the places I see it happening.

biot 3 days ago 3 replies      
Will there ever be the ability to upvote a story without it going into your "Saved stories" section? 99% of the time I upvote a story it's because I want to save it for future reference. I'd like the ability to upvote (and downvote) stories based on whether they're HN-worthy without it impacting the "Saved stories" section.
kposehn 3 days ago 5 replies      
I'm glad to hear that these changes seem to be working. One thing I am (slightly) concerned about is the occasional funny/witty/hilarious comment that will get downvoted into oblivion rapidly. It isn't necessarily that it is a troll posting, but maybe someone injecting a bit of humor.

That said, I do understand if the mods/community do not feel that witticisms have as great an importance on HN - yes, seriously - so this is not a criticism, just an observation.

chimeracoder 3 days ago 2 replies      
> The majority of HN users are thoughtful and nice. It's clear from the data that they reliably downvote jerks and trolls

I have to say, I'm a bit confused now. Aren't "trolls" the sorts of comments that are supposed to be flagged[0]? (I understand that spam is meant to be flagged, but HN gets very few true spam comments[1]).

What is the difference between downvoting and flagging for comments specifically - and more importantly, what comments should be downvoted?

I've read conflicting arguments (both sides quoting pg, incidentally) that disagree on whether or not downvotes should be used to signify disagreement, or whether one should downvote comments that are on-topic but have little substance (ie, most one-liners).

[0] I guess this depends on your definition of "troll", but I think a well-executed troll is similar to Poe's law: the reader can't tell whether the commenter is being flippant/rude or sincere. In other words, it's just enough to bait someone into responding, without realizing immediately that it's a worthless comment.

[1] eg, ads for substances one ingests to change the size of a particular masculine organ, or (less blatantly) direct promotions for off-topic products.

Serow225 3 days ago 1 reply      
Dang and friends, any chance of tweaking the layout so that it's not so easy to accidentally click the downvote button when using a mobile browser? This is commonly reported. Thanks!
mbillie1 3 days ago 1 reply      
> The first is posting feedback in the threads about what's good and bad for HN comments. Right now, dang is the only one doing this, but other moderators may in the future.

I've seen dang do this and I think it's actually quite effective. I'd love to see more of this.

maaaats 3 days ago 0 replies      
I like the new openness.
specialk 3 days ago 1 reply      
I find the idea of commenters with higher karma having more powerful downvotes slightly disconcerting. My fear is that if people downvote comments that are well meaning and relevant but whose content they disagree with, we will only ever see one train of thought rise to the top of comment threads.

This could start a vicious cycle where voting cabals of power-users form. For example, if Idea X becomes popular among some members of HN, they will be able to always steer the discussion toward Idea X, or downvote a competing, valid Idea Y into oblivion. Comment readers could be converted to Idea X, as it is always appearing at the top of relevant comment threads. So now the voting cabal has even more members, growing the dislike of Idea Y. The cycle then repeats. The discussion is then steered over time by the thoughts of a select few power-users.

Maybe this is just the natural order of things and I'm subconsciously afraid of change. Thoughts?

joshlegs 3 days ago 0 replies      
Wow. I am overly happy that you guys have figured out a way to give commenting feedback. I had an account way back when that was shadowbanned, for reasons I never knew. Still don't. I feel like if this system had been implemented back then, I would have had a better idea of what was wrong with what I said.

Also, I'm pretty sure you've found the secrets to good Internet moderatorship. So many forums went off course because of ban-happy moderators that didn't want to actually take the time to moderate the community, instead just banhammering people. Kudos to you guys.

tedks 3 days ago 0 replies      
> (and specifically, they don't silence minority groups; we've looked into this)

How have you looked into this, and what have the results been?

What efforts are you going to take to ensure it stays true in the future?

There are other comments asking these questions that have so far not been answered; it would be good to answer them. It's very unsettling when people (primarily from a privileged/majority standpoint) proclaim that things "don't silence minority groups" and handwave the justification.

In general I've found HN to be much more positive towards feminism in particular than similar communities like Reddit or others that I won't name, but the tech industry has large issues in this area and it's surprising to me that this would be the case.

In particular, it seems likely to me that HN will selectively not-silence minority voices that tend to agree with the status quo or pander to majority voices. I'd be surprised if your analysis accounted for that, but I'd be very, very happy to be wrong.

Thrymr 3 days ago 1 reply      
> posting feedback in the threads about what's good and bad for HN comments.

Am I the only one who thinks that posting more meta-discussion directly in comments reduces the overall quality rather than increases it?

Maybe a downvote should come with a chance to add an explanation that can be seen on a user's page or on a "meta" page, but not dilute the discussion itself.

User8712 3 days ago 1 reply      
Are comments ever deleted or hidden from view completely? I've been reading HN for a year or two, and I've never noticed an issue with comment quality. In topics with a larger number of comments, you get one or two heavily downvoted posts, but that's it.

My question: is there an issue with comments that I'm not seeing? Do the popular topics on the homepage have dozens of spam or troll comments that are pruned out constantly, so I don't notice the problem? Or is the issue those 1 or 2 downvoted comments I mentioned earlier?

HN receives a small number of comments, so fine-tuning the algorithms isn't a big deal in my opinion. This isn't Reddit, where the number one post right now has 4,000 comments. That presents a lot of complications, since they need to try to cycle new comments so they all receive some visibility, allowing them a chance to rise if they're of high quality. On HN, you have 20 comments, or 50 comments, so regardless of the sorting, nearly everything gets read. As long as HN generally sorts comments, they're fine.

aaronetz 3 days ago 0 replies      
I have noticed that people oftentimes downvote because of disagreement, even when the comment seems to be okay (to my eyes at least). How about eliminating the downvote, leaving only the "flag" which makes it clearer that it should not be used for disagreement? It would also make comments more consistent with top-level stories (which I sometimes think of as "root-level comments".)
camus2 3 days ago 1 reply      
In my opinion, just like SO, downvotes should actually cost karma. Yes, sometimes some messages are just bad and trolling, but sometimes people get downvoted just because they don't "go with the flow" and they have unpopular ideas. So if a downvote costs 2, the downvoter should lose 1, for instance. And please don't downvote me just because you disagree.

EDIT: just proved my point; why am I being downvoted? It was a simple suggestion, yet someone downvoted me, just because he can and it's free. I was not trolling or anything... I just wanted to participate in the debate.

olalonde 3 days ago 0 replies      
I know it would be a pretty big experiment both technically and conceptually, but I will propose it just in case.

I have noticed that usernames might influence the way I vote. What if usernames were not displayed in comments? Now this leads to two problems: 1) it makes it hard to follow who replied to what in threads 2) it makes it more tempting to post bad comments given the lack of accountability. I think the first problem could be solved by assigning users a per-submission temporary username picked at random from a name/word list. The second problem could be solved by linking those random usernames to the actual profile page of who posted (just like HN currently does). It wouldn't stop deliberate attempts at up/down voting specific users, but it would remove the unintentional bias.
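The per-submission pseudonym idea above can be sketched very simply: hash the (user, submission) pair and index into a word list. The word list, hash choice, and function names here are all hypothetical illustrations of the proposal, not any real HN mechanism:

```python
# Sketch of per-submission pseudonyms as proposed above: the same user keeps
# a stable alias within one thread but (very likely) gets a different one in
# every other thread. The word list and hashing scheme are assumptions made
# up for this example.
import hashlib

WORDS = ["falcon", "maple", "quartz", "harbor", "ember", "willow"]

def pseudonym(user_id: str, submission_id: int) -> str:
    """Deterministically map (user, submission) to a display name."""
    digest = hashlib.sha256(f"{user_id}:{submission_id}".encode()).hexdigest()
    return WORDS[int(digest, 16) % len(WORDS)]

# Stable within a thread, so replies can still be followed:
assert pseudonym("some_user", 7579191) == pseudonym("some_user", 7579191)
```

Linking each alias back to the real profile page (as the comment suggests) would preserve accountability while removing the at-a-glance username bias.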

mck- 3 days ago 0 replies      
May I also suggest an update to the flamewar trigger algorithm? Or at least, this is what led me to believe it is a flamewar trigger [1].

Oftentimes a post is doing really well [2], accumulating a dozen upvotes within 30 minutes and jumping up the front page, but then because of two comments it gets penalized to the third page. I can see it being triggered when there are 40 comments, but the first trigger seems awfully low.

[1] https://news.ycombinator.com/item?id=7204766

[2] https://news.ycombinator.com/item?id=7578670

Bahamut 3 days ago 2 replies      
I've seen plenty of downvotes from people who didn't understand what was being said or who wanted to exert their opinions. To be honest, that partly makes me not want to contribute thoughts, since they may be unpopular or not jibe with a hive mentality, and it has gotten me to visit the site less for the comments, especially with the recent tweaks.

It'd be nice if something could be figured out to discourage this behavior through reduction of the value of the downvotes of such, especially if a comment has not had a response to explain the downvote.

lettergram 3 days ago 0 replies      
"We believe this has made the comment scores and rankings better reflect the community."

It would be interesting to see how you could actually change the community via comment filtering.

For example, suppose some individuals were always posting negative comments and were previously not silenced. I wonder whether, now that they are being silenced, they would leave the community entirely, just keep posting and ignore the results, or change their comments to fit the community.

zatkin 3 days ago 3 replies      
I recently joined Hacker News, and actually read through the guidelines before making an account. If anything convinced me to be smart about what I post, it was those guidelines.
raverbashing 3 days ago 0 replies      
In one thread I had a moderator intervention happen; however, I think the moderators, when speaking "on behalf of HN", should have a way to indicate that (like an indication on their username, or something similar).

Otherwise it looks like anyone just decided to intervene.

chrisBob 3 days ago 0 replies      
The biggest problem I see is that the combination of a threaded discussion and the strong ranking provides an incentive for replying to another comment even when a new top-level comment would be more appropriate.

This, for example, is much more likely to be buried than if I replied a few comments down on the thread from bravura.

abdullahkhalids 3 days ago 0 replies      
It would be interesting if you published stats for each user: how often they upvote and downvote compared to the average for starters.

It would also be useful to know how often other people upvote (downvote) the comments I upvote (downvote).

These stats should only be privately viewable.

gautambay 3 days ago 0 replies      
>> and specifically, they don't silence minority groups; we've looked into this

curious to learn how this analysis was conducted. E.g., how does HN determine which users belong to a minority group?

onmydesk 3 days ago 3 replies      
"We believe this has made the comment scores and rankings better reflect the community."

Is that desirable? A better debate surely entails more than one opinion. I also don't know what a 'jerk' is - someone that disagrees with the groupthink?

I just don't think it's that big a problem. But that's just one opinion that might differ from the collective, and therefore must have no merit? An odd place. Over-engineering! To be expected, I suppose.

rickr 3 days ago 0 replies      
Is there a template or example post for the first item?

I've thought about doing this in the past but I didn't want to seem too elitist.

mfrommil 3 days ago 0 replies      
I've always thought of upvote/downvote as a "thumbs up" or "thumbs down" - do I like your comment?

Sounds like the new algorithm penalizes disrespectful/spammy comments, rather than the "difference in opinion" comments (which is good). Could a 3rd option be added to differentiate this, though? Have option for upvote, downvote, and mark as spam (I'm thinking a "no" symbol).

dkarapetyan 3 days ago 0 replies      
Awesome. Keep up the good work. I am definitely enjoying the new HN much more. The quality of articles is way up and the comment noise is way down.
ballard 3 days ago 0 replies      
Definitely gotta give you guys a standing ovation for yeoman's work.
darkstar999 3 days ago 1 reply      
When (if ever) do I get a downvote button?
brudgers 3 days ago 0 replies      
It might make sense to increase the amount of time in which a negatively scored comment can be edited or deleted.
robobro 3 days ago 0 replies      
Thanks, guys - didn't come to say anything more
bertil 3 days ago 0 replies      
> specifically, they don't silence minority groups; we've looked into this

I would love to have more details about that: what do you define as a minority, and how do you measure silencing?

borat4prez 3 days ago 0 replies      
Can I use the new HN comments algorithm on my new website? :)
Igglyboo 3 days ago 0 replies      
Could we please get collapsible comments?
darksim905 3 days ago 0 replies      
Wait, you can downvote?
larrys 3 days ago 1 reply      
"It's clear from the data that they reliably downvote jerks and trolls"

Most people know what a jerk is. Perhaps, though, you (and others) could define what a troll is for the purpose of interpreting this statement. (Of course I know the online definition [1], but there seems to be much latitude in "extraneous, or off-topic messages" or "starting arguments".)

Specifically also from [1]:

"Application of the term troll is subjective. Some readers may characterize a post as trolling, while others may regard the same post as a legitimate contribution to the discussion, even if controversial."

While as mentioned I know what a jerk is, I can also very easily see someone throwing out "troll" to stifle someone else in a more or less parental way - that is, to dismiss something as simply not important or not even worthy of discussion.

[1] http://en.wikipedia.org/wiki/Troll_%28Internet%29

pearjuice 3 days ago 3 replies      
Can anyone explain to me how this is not putting the common denominator in power even further? At this point, unless you extensively agree with the majority of the echo chamber, I doubt you will be able to have any impact on discussions.

Every thread is a rehearsal of the same opinions at the top over and over, while non-fitting opinions float to the bottom. In turn, they get less "downvote-power", so they will stay low and can't pull their peers up. I am not saying that the current flow of discussion is bad, I am just saying that participation is flawed.

We are simply in a system where you get rewarded for fitting in with the masses, and you get more power once you have been accepted into the hive-mind. A circular reference, at some point.

Quantum Entanglement Drives the Arrow of Time simonsfoundation.org
295 points by jonbaer  4 days ago   142 comments top 35
cromwellian 4 days ago 6 replies      
This Google Talk https://www.youtube.com/watch?v=dEaecUuEqfc uses entanglement and quantum information theory in a clear and understandable way to explain 'spooky' quantum phenomena, like the quantum eraser, decoherence, the Aspect experiment, and the measurement problem. Even if you don't know any QM - just basic algebra and calculus - it's really approachable.

I used to be a fan of the Many Worlds interpretation, but after seeing this, I'm now a big fan of the Quantum Information Theory explanation. Starting about 43 minutes in, he goes into the QM Information Theory explanation, but I'd recommend watching the entire prezo.

Link to my original post on the subject: https://plus.google.com/110412141990454266397/posts/HC49S9ip...

tim333 4 days ago 6 replies      
The reasoning sounds a bit iffy as in:

"Finally, we can understand why a cup of coffee equilibrates in a room," said Tony Short, a quantum physicist at Bristol. "Entanglement builds up between the state of the coffee cup and the state of the room."

I think you can understand coffee cooling quite well without any quantum stuff: the atoms in the coffee are moving faster than those in the room, so when one impacts an atom of the air, there will be a tendency for the air atom to speed up and the coffee atom to be slowed.

Actual quantum entanglement is a strange and interesting thing. It's a shame people tag the term onto things where it is not really relevant, mostly to try to sound impressive.

dalek_cannes 4 days ago 1 reply      
Do we need entanglement to explain the Arrow of Time? Even though in classical mechanics, the past and the future are both equally observable, we remember the past and not the future because the future does not contain certain information yet -- the information to be introduced into the universe in the form of quantum fluctuations. One could even argue that all information in the universe was created at some point in time due to one quantum event or other.

I may have misunderstood though (I'm not a physicist). Entanglement does, however, explain why systems tend to equilibrium rather than any other type of state as they evolve forward in time.

On a related note, I found this quote interesting. It reminds me of how HN comments about quantum information theory have a tendency to get downvoted:

> The idea, presented in his 1988 doctoral thesis, fell on deaf ears. When he submitted it to a journal, he was told that "there was no physics in this paper." Quantum information theory "was profoundly unpopular" at the time, Lloyd said, and questions about time's arrow "were for crackpots and Nobel laureates who have gone soft in the head," he remembers one physicist telling him.

pygy_ 3 days ago 0 replies      
It reminds me of another paper that was discussed here earlier (I can't find the submission, though):

A quantum solution to the arrow-of-time dilemma, by Lorenzo Maccone:

    The arrow of time dilemma: the laws of physics are invariant for time inversion, whereas the familiar phenomena we see everyday are not (i.e. entropy increases). I show that, within a quantum mechanical framework, all phenomena which leave a trail of information behind (and hence can be studied by physics) are those where entropy necessarily increases or remains constant. All phenomena where the entropy decreases must not leave any information of their having happened. This situation is completely indistinguishable from their not having happened at all. In the light of this observation, the second law of thermodynamics is reduced to a mere tautology: physics cannot study those processes where entropy has decreased, even if they were commonplace.

Phys. Rev. Lett. 103, 080401 - Published 17 August 2009

http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.103... http://arxiv.org/abs/0802.0438

sheerun 4 days ago 3 replies      
I love the following article: http://www.flownet.com/ron/QM.pdf

It basically shows that observation (measurement) and entanglement are the same things.

Think about it: particles are not magically going out of superposition as we observe (measure) them. We (our atoms) become entangled with those particles; we become part of the superposition. It's just propagation of the entangled state.

Why don't we perceive ourselves as being in superposition? "It turns out that this result generalizes to any number of mutually entangled particles. If we ignore any one particle, the entropy diagram of the remaining particles looks like a system of N-1 particles in a classically correlated state with a non-zero entropy." That means each atom of our bodies perceives the other atoms entangled with it as if they were not in any superposition (though as a whole, the system is still in superposition). We (atoms) are constantly entangled and in superposition with our environment, but we perceive it as a classical state.

In what state does each atom "see" every other? According to probability. That's why in the double slit experiment we see only one of the most probable outcomes, not a random one.

Time could be the rate of entanglement propagation. Entanglement propagates at the speed of light (the speed of particles), so we seem to live in the same timeline. But if something moves away from us at the speed of light, time goes slower for that object, but only relative to us.

Until two particles interact in any way, they live in totally different timelines. After they "observe" each other (entangle with each other), their time also becomes entangled. That's why after we see a cup being dropped, it becomes part of our reality, and the cup becomes broken in our time.

We live in spacetime. As mentioned in the article: "Spooky action at a distance ought to be no more (and no less) mysterious than the spooky action across time which makes the universe consistent with itself from one moment to the next."

Why an arrow of time? The article says: "Under QIT, a measurement is just the propagation of a mutually entangled state to a large number of particles. To reverse this process we would have to "disentangle" these quantum states. In principle this is possible. In practice it is not." I think differently though.

That are my thoughts. Please don't judge :)

elzr 4 days ago 2 replies      
This was surprisingly beautiful. As a geek in programming/computers/information/mathematics, but only a physics admirer from afar, it is very suggestive, even natural, to explain the deepest physical reality in terms of information:

"It was as though particles gradually lost their individual autonomy and became pawns of the collective state. Eventually, the correlations contained all the information, and the individual particles contained none. At that point, Lloyd discovered, particles arrived at a state of equilibrium, and their states stopped changing, like coffee that has cooled to room temperature."

"What's really going on is things are becoming more correlated with each other," Lloyd recalls realizing. "The arrow of time is an arrow of increasing correlations."

"The present can be defined by the process of becoming correlated with our surroundings."

mbq 4 days ago 7 replies      
This is nonsense; entropy and the arrow of time are essentially many-body effects and require no quantum effects to occur. The simplest way to see it is to make a small simulation of, say, 1000 gas particles with only classical bouncing, starting on one side of a box partitioned in half with a barrier - obviously with a time-reversible numerical method. After the removal of the barrier, the gas will evenly spread over the box without any entanglement.
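The thought experiment above is easy to run. Below is a minimal version: non-interacting ideal-gas particles start in the left half of a unit box and evolve under time-reversible free flight with elastic wall bounces. The particle count, time step, and run length are illustrative choices, not anything prescribed by the comment:

```python
# Minimal sketch of the classical thought experiment described above:
# particles start confined to the left half of a [0, 1] box, then spread
# under reversible dynamics (free flight + elastic wall reflection).
# No quantum effects anywhere; all parameters are illustrative.
import random

random.seed(0)  # reproducible run
N, STEPS, DT = 1000, 200, 0.01

# Initial condition: everyone in the left half, random velocities.
pos = [random.uniform(0.0, 0.5) for _ in range(N)]
vel = [random.uniform(-1.0, 1.0) for _ in range(N)]

for _ in range(STEPS):
    for i in range(N):
        pos[i] += vel[i] * DT
        # Elastic, time-reversible bounce off the walls at 0 and 1.
        if pos[i] < 0.0:
            pos[i], vel[i] = -pos[i], -vel[i]
        elif pos[i] > 1.0:
            pos[i], vel[i] = 2.0 - pos[i], -vel[i]

left = sum(p < 0.5 for p in pos)
print(left / N)  # roughly 0.5 once the gas has spread out
```

Note that every update is exactly reversible (negate all velocities and the gas marches back into the left half), yet forward in time the only thing we ever observe is spreading - the probabilistic arrow of time without any entanglement.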
jostylr 4 days ago 0 replies      
From a Bohmian perspective, quantum mechanics consists of a wave function psi(q) that guides all the particles Q. The wave function is distinct from the particles. The particles are in equilibrium, relative to the wave function. It is the wave function that is not in equilibrium in its realm of states.

As it turns out, the usual psi^2 probability distribution of the particles is a reflection that the particles are in quantum equilibrium, that is, psi^2 is the natural measure in quantum mechanics for what equilibrium ought to be since it is the only measure preserved by the dynamics. And so if the particles start that way, they stay that way. And they are likely to start that way using psi^2 as the distribution.

There is actually a lot of subtlety involved in accepting that argument; I recommend http://plato.stanford.edu/entries/qm-bohm/#qr and an actual paper: http://www.ge.infn.it/~zanghi/BMQE.pdf

But what it implies is that the wave function is responsible for the arrow of time. It is a special state that evolves into a less special state. Presumably this is what their research is pointing at.

I would also comment that their description is exactly the classical explanation transferred to the quantum world (which it needs to be since our world is quantum). That is, we start in a special state and it evolves into a less special state because the less special states are more numerous and so more likely to be, all things being equal. And by more likely, we are talking 10^100 kind of more likely.

They still have the problem that the fundamental evolution of the wave function is time reversible. So if that bothered someone (it shouldn't), then their argument does not actually resolve that problem.

So I take from their work that what they are doing is getting the classical thermodynamic explanation (which is about volumes in phase space, not human ignorance) and translating it to the quantum theory. Neither wrong nor revolutionary.

TeMPOraL 4 days ago 1 reply      
Is this really new? IANAP, but I clearly remember being taught about the Arrow of Time as a probabilistic/thermodynamical phenomenon even in high school, and I have also read similar explanations that involved causality and probability theory without referring to quantum entanglement. Is the "quantum" bit even needed here for anything?
fspeech 3 days ago 0 replies      
The title of the article is unfortunate. Classical physics adequately addresses the arrow of time through thermodynamics (the second law in particular). What is missing is so-called "decoherence". In other words, quantum physics is supposed to explain everything, but when we interpret results we divide the world into classical (the observer) and quantum (the observed) parts. The answer, from reading the article, seems to be that even if the world is in a pure (quantum) state, a large part of it could behave like a mixed (classical observer/environment) state through entanglement. This makes it easier to have a coherent mental picture of quantum physics.
SoftwareMaven 3 days ago 0 replies      
I wonder how much knowledge is lost because the research isn't "popular". What is the opportunity cost of so many researchers doing string theory research, not necessarily because they believe they'll find a breakthrough (obviously, this doesn't describe most), but because they won't be able to get published or find a research position if they aren't doing the "in" thing.

This doesn't just apply to physics, but the history of physics makes it easy to find case studies in this.

millstone 4 days ago 2 replies      
> After some time, most of the particles in the coffee are correlated with air particles; the coffee has reached thermal equilibrium.

No doubt this is some way oversimplified explanation, but it still makes no sense.

Say I have hot coffee and lukewarm coffee. The lukewarm coffee will equilibrate faster. Does it interact with the air faster? What if I bring in coffee that's the same temperature as the air, so that it's instantly at equilibrium. Does it interact with the air instantly?

throwaway7548 4 days ago 2 replies      
I have a question. I just went to a source of physical (quantum) randomness http://www.randomnumbers.info/ and I'm giving you a random number between 0 and 10,000 which I've just generated there. Here it goes: 6296.

Ok. Now that light cone has finally reached you. And you (the neurons in your brain, to be precise) are thoughtfully entangled with that random event (outcome), now in your past.

Now imagine the following. A few days pass. And you forget that number. A few years pass. The connections between the neurons which were storing this information are now gone. The molecules and atoms which were part of those neurons are gone from your body. There are no entanglements any more linking you to that event. Is that event in your future now? Again?

neolefty 4 days ago 0 replies      
From the article

  One aspect of time's arrow remains unsolved. "There is nothing in these works to say why you started at the gate," Popescu said, referring to the park analogy. In other words, they don't explain why the initial state of the universe was far from equilibrium. He said this is a question about the nature of the Big Bang.
Could it be that expansion, which proceeded much faster than light, therefore didn't allow entanglement to take place, delaying the heat death of the universe until everything is fully entangled?

If expansion had been slower, would entropy maybe have kept up with it, leaving us as just a single black hole instead of a dispersed, interesting, unentangled, things-are-still-happening universe 13 billion years later?

spcoll 4 days ago 1 reply      
The question of whether time is in fact directional is far from being closed, at least for quantum physicists. In fact, one of the physicists cited in the article is known for proposing a time-symmetric formulation of Quantum Mechanics [1].

[1] http://www.phy.bris.ac.uk/people/Popescu_S/papers/sandu_othe...

rturben 3 days ago 0 replies      
A lot of people in this thread are questioning how this adds any new information about time or how it is different/better than a classical explanation of systems (coffee cup reaching equilibrium based on thermodynamic laws).

One way that this result makes sense to me is by considering the properties of light speed and "spooky action at a distance." Particles become entangled with one another at the speed of light -- photons or fields carrying the information between the two. Looking at this from the perspective of light speed, there has been an instantaneous change between the two particles. State A has led directly to a more complicated, entangled State B. Still looking at this from light speed, there is no time between the transition from one state to the next and from that one to the next and so on. The universe has already worked itself out from the initial disentangled state to all the states that are increasingly more entangled.

Thanks to Einstein, we know that all objects try to move at light speed, but that the more massive they are the slower they become. Because we are massive objects, we don't experience time instantaneously like the photons do. We see the propagation of entanglement and see the state transitions. Our massiveness has given rise to a direction of time, the order that we understand the states of the universe to be proceeding in. Unlike light, we have to experience all the intermediate states in the order of less entangled -> more entangled. Thus an arrow of time.

This is already subtly bundled up in the classical explanations. Coffee cools off because it reaches equilibrium. Classical physics says this is because the particles in the coffee are hotter than the surrounding air, so it is more likely for those particles to break free of the coffee, thereby reducing its average kinetic motion. Consider though how those particles are interacting with one another. They don't just "know" the direction they're supposed to go, they bump into each other's fields and communicate at light speed. Each particle informs the next and as they become more entangled and learn more about where they are, they progress from state to state.

yati 4 days ago 2 replies      
I've always wanted to study quantum mechanics because of this very "entanglement". Can people please post recommendations for good resources/books on the topic, for a person like me with no solid physics background (except college-level courses)?
spikels 4 days ago 0 replies      
Quantum mechanics is where physics became more like mathematics: common sense no longer provides much guidance. It is really cool that it provides the missing explanation for one of the most common sense ideas in classical physics: the arrow of time.
Houshalter 4 days ago 1 reply      
Just something I've been thinking about: time has a direction because of causation. State1 causes state2, which causes state3, and so on. You get weird paradoxes if you allow causation to work in both directions. The universe would also have to magically align everything perfectly so that everything is consistent.

Another observation is that even with reversible laws of physics that can work in both directions, if you have a single starting state, all other states will causally propagate from it. In a single dimension of time/causation.

clavalle 3 days ago 2 replies      
Does it seem to anyone else that quantum entanglement and decoherence are the universe's way of doing the least amount of computation possible? Like the universe is lazily loaded?
Tarrosion 3 days ago 1 reply      
I found the article rather confusing: it starts out by saying, look, the laws of physics make sense forwards or backwards, but we only see one kind of process (entropy increasing). Why is that? Entanglement.

Is there something about entanglement that is irreversible? As the article says, "it is the loss of information through quantum entanglement, rather than a subjective lack of human knowledge, that drives a cup of coffee into equilibrium with the surrounding room." Okay, but then why don't we ever see the reverse, where coffee departs from equilibrium? Something like: the acquisition of information through breaking entanglement drives a cup of coffee away from equilibrium.

denom 4 days ago 1 reply      
In the article the author describes the notion of a "pure state", which is something that has independently evolving probability. Individual 'units' lose their pure state and become part of an entangled ensemble, i.e. they move to equilibrium.

How is the evolution of biological organisms and technological systems explained in this sense? Played backwards, evolution would fit this and traditional notions of thermodynamic entropy. Is evolution a kind of de-entangling?

dominotw 4 days ago 0 replies      
I love this 'arrow of time' documentary [1] if you are looking for something fun to watch.

[1] https://www.youtube.com/watch?v=4BjGWLJNPcA
thibauts 4 days ago 0 replies      
So, if I get it right, states become more and more coupled, thus entropy tends to decrease in an open system? I'm confused.
officialjunk 4 days ago 0 replies      
i recall learning that time "flows" both ways at the quantum scale, but i admit it has been a while since i've attended any lectures. have there been any new discoveries to say otherwise? i think i've read about research on both time reversal violations and time-invariance at the quantum scale.

also, what are people's thoughts on time being an emergent property at the macro scale, with everything down at the quantum level described by time-independent equations, like the Wheeler-DeWitt equation? http://en.wikipedia.org/wiki/Wheeler%E2%80%93DeWitt_equation

wmnwmn 3 days ago 0 replies      
It's never been clear to me how deeply the thermodynamic quantities are really connected to time. For example there could be a state in which entropy (or entanglement) increased from left to right in space, yet it wouldn't mean that time flows from left to right.
analog31 4 days ago 0 replies      
Should I be looking for Planck's constant in the equations of thermodynamics?
one-more-minute 4 days ago 0 replies      
This is an interesting step, but it doesn't actually explain why time is asymmetrical. Ok, so things equilibrate as time moves forwards because they entangle as time moves forwards. But this just shifts the question: why is entanglement asymmetrical in time, when the underlying laws are not?

You still have the same problem: if you reverse time, the states become untangled and the coffee heats up.

It's nice to be able to model this from a quantum perspective, but make no mistake: no philosophical issues have been resolved here, and we don't "finally" understand anything we didn't before.

homerowilson 3 days ago 0 replies      
David Ruelle elegantly discusses similar ideas in a great nontechnical book called Chance and Chaos (1995). Worth reading.
EGreg 4 days ago 0 replies      
Wow, just today I read this:


and I thought it was all explained quite simply, and now this?

EGreg 3 days ago 0 replies      
The real question is, given the second law of thermodynamics, where did all the order come from in the first place?
EGreg 4 days ago 1 reply      
I thought it was the second law of thermodynamics that already explained the arrow of time? Well, that and friction?
softatlas 4 days ago 1 reply      

    The rate of information increases.
Hence why

    Information wants to be free.
Parasitic on

    Only information explains its own existence.
Which all, I think, intuitively follows from Spinozist/Cartesian "Conatus" principle. That is to say:

    The order and connection of ideas is the same as the order and connection of things.
Some of us rave about this or that: "well, how many folk use X today" or "qualify as X" or "subscribe to X". But these expressions are all within the scope of multiply converging nexuses of increasing correlative potentia. The coffee cup is a simple example so like Wittgenstein's point: "if a lion could speak, we could not understand him". The lion, like the cup, has restricted correlative powers: these laws apply, these others do not.

The laws of information are laws about the dimensions of proportionality, which give the arrow of time an aspect of curvature (needing to exhaust a universe for exponentially narrowing arrows, so the onion-skinning of properties of a thing "come what may" at "frozen" temporal localities: what happens when we "bend" time at certain family-resemblance (physical) properties?).

salimmadjd 4 days ago 2 replies      
Seeing this article is rather bittersweet. I came to a similar conclusion in my college years but never pursued it further.

Taking Quantum Physics in college was a life changing experience and it reshaped how I viewed the world. I was always obsessed by time and one afternoon it became clear.

I explained my variation not as a cup of coffee but a handful of dice. Essentially every tick of time is rolling these dice. And the variation of dice from one combination to the next is the arrow of time.
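To make the dice picture concrete, here's a quick toy simulation (purely illustrative; the counts and the one-die-per-tick rule are made up): start with every die on the same face, re-roll one random die per tick, and the Shannon entropy of the face distribution climbs from zero toward its maximum and stays there.

```python
import math
import random
from collections import Counter

def face_entropy(dice):
    """Shannon entropy (bits) of the distribution of faces currently showing."""
    counts = Counter(dice)
    n = len(dice)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(42)
dice = [1] * 100          # perfectly ordered start: every die shows 1
history = [face_entropy(dice)]
for _ in range(1000):     # each "tick of time" re-rolls one die
    dice[random.randrange(len(dice))] = random.randint(1, 6)
    history.append(face_entropy(dice))

print(history[0], history[-1])  # starts at 0, ends near the maximum log2(6) ~ 2.58
```

The reverse trajectory (all dice spontaneously returning to the same face) is allowed by the rules but becomes astronomically unlikely as the number of dice grows, which is the arrow.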

Like one of the authors in this article, I got the most resistance from physics majors. For the most part they had a dogmatic view of anything they had not studied yet. If it wasn't in their books then it didn't exist.

I also came to the conclusion that time travel as depicted in the movies will never happen. It can happen randomly in a smaller body, but for anything large the arrow of time is almost impossible to reverse.

suprgeek 4 days ago 0 replies      
At or very near the Big Bang, the Universe was in a state of minimum entropy, i.e. minimum entanglement, i.e. maximum order (in some sense).

Post Big Bang, the cosmological arrow of time is in the direction of increasing disorder, i.e. increasing entanglement, i.e. decreasing order.

In a smaller closed system: Before is when the system is more pure, less entangled, more ordered; After is when it has become less ordered, more entangled.

Obvious really...

Ask HN: Your favorite YouTube channels?
288 points by stevenspasbo  2 days ago   127 comments top 86
petercooper 2 days ago 2 replies      
https://www.youtube.com/user/Confreaks records high-quality videos at many programming conferences each year and then shares the recordings on YouTube. So much to enjoy here, especially if you're open-source leaning, like most of the events they cover.

O'Reilly puts up lots of good stuff at https://www.youtube.com/user/OreillyMedia although the webinar recording quality leaves a lot to be desired. A really random set of tech topics, though, and often something worth watching.

Entrepreneur - https://www.youtube.com/user/EntrepreneurOnline - usually puts up lots of short videos with a business tip in or something. Sometimes longer interviews. Usually worthwhile if a little superficial at times.

The guy who founded Something Awful has a ridiculously addictive channel - https://www.youtube.com/user/lowtaxico - he generally plays horrifically poorly produced indie games with his sidekick Shmorky and I could listen to their absurdist banter all day.

It's a bit of a mish mash but https://www.youtube.com/user/Bisqwit always blows my mind when he does his coding videos such as coding a NES emulator in C++11: https://www.youtube.com/watch?v=y71lli8MS8s

https://www.youtube.com/user/Shmee150 is awesome if you're into supercars. He's currently doing a tour of European supercar events and factories putting up a video each day.

Far Lands or Bust - https://www.youtube.com/user/kurtjmac - is a guy who started to walk towards the 'far lands' in a Minecraft map years ago. He's something like 10% of the way but is still plodding along, recording his progress. This is a real pilgrimage with all the highs and lows that entails.

MrThaiBox123 - https://www.youtube.com/user/MrThaibox123 - is a British IT expert who seems to have an endless supply of cash to buy gadgets, phones, and amazing computer setups.. and he does incredibly well recorded reviews of them. He also has a vlog at https://www.youtube.com/channel/UCrXrOof3iFRZYJGqqApH3Ng which I find interesting to see behind the scenes of someone's life on a daily basis.

armansu 2 days ago 2 replies      
I have this hard-to-break habit of watching at least one startup/entrepreneurship/creativity video before going to bed at night, so I hope I'm somewhat qualified to answer this question. My personal favorites from the channels I'm currently subscribed to are (sorted by preference; in descending order):

- https://www.youtube.com/user/EverySteveJobsVideo - All the Steve Jobs videos in one channel

- https://www.youtube.com/user/1veritasium - Veritasium: an element of truth

- https://www.youtube.com/user/webofstories - Stories from Donald Knuth, Benoit Mandelbrot, Marvin Minsky

- https://www.youtube.com/user/PandoDaily - the fireside chats with Peter Thiel, Elon Musk, Fred Wilson, Brian Chesky, John Doerr, Tony Hsieh are especially recommended

- http://www.youtube.com/user/ThisWeekIn - my favorite episodes are those with Naval Ravikant, Phil Libin, David H. Hansson, Chris Sacca, Chamath Palihapitiya and Eric Ries

- http://www.youtube.com/user/ecorner - Look for the talk by Phil Libin

- https://www.youtube.com/user/bigthink - Larry Wall and DHH

- http://www.youtube.com/user/kevinrose - Ignoring the raccoon toss video :D

- http://www.youtube.com/user/AtGoogleTalks - Look for a conversation with Garry Kasparov

- http://www.youtube.com/user/KasparovCom - Into the night with Garry Kasparov and Peter Thiel

- https://www.youtube.com/user/techcrunch - Don't laugh, but I love watching TC Cribs.

- http://www.youtube.com/user/UCBerkeleyHaas - Look for Guy Kawasaki!

- http://www.youtube.com/user/masterlock77 - Trial by Fire: Yabusame

- http://www.youtube.com/user/leweb - Look for Gary Vee!

- http://www.youtube.com/user/StartupGrind - Check out the fireside chat with Vinod Khosla.

- http://www.youtube.com/user/atotaldisruption - Justin Kan!

- http://www.youtube.com/user/500startups/ - Marc Andreessen & Dave McClure!

- https://www.youtube.com/user/building43 - 'small teams BIG IMPACT' by Robert Scoble

- https://www.youtube.com/user/stanfordbusiness - Look for the fireside chats with Elon Musk and Marc Andreessen

- https://www.youtube.com/user/princetonstartuptv - Princeton Startup TV

nostromo 2 days ago 0 replies      

Numberphile is a channel of really pleasant and interesting math videos. It's intended for a general audience; any level can enjoy it.

Perhaps you should show these videos to your kids too? When I was young I thought math was boring. It wasn't until college that I found out it was secretly very interesting.

vailripper 2 days ago 3 replies      
I enjoy woodworking when I'm not coding, so I have several woodworking channels I enjoy:

https://www.youtube.com/user/urbanTrash - Frank Howarth - Beautiful projects and his videos are very creative.

https://www.youtube.com/user/DrunkenWoodworker - Interesting work.

https://www.youtube.com/user/mtmwood - amazing geometric work

Goosey 2 days ago 1 reply      
Not exactly tech/programming channels, but really good brain snack food...

https://www.youtube.com/user/Vsauce --- IMHO the best youtube channel in existence. Every video is a rabbit hole of interesting questions and tangents with fantastic presentation and weirdly uplifting closing points.

https://www.youtube.com/user/pbsideachannel --- Smart thought provoking videos that use internet memes, gaming, anime, and such as the launch off points.

https://www.youtube.com/user/1veritasium --- Well presented science videos with a focus on the joy of learning.

gkya 2 days ago 2 replies      
- Art of the Problem, provides introductory videos on Information theory, https://www.youtube.com/channel/UCotwjyJnb-4KW7bmsOoLfkg

- Fosdem talks, https://www.youtube.com/channel/UC9NuJImUbaSNKiwF2bdSfAw

- Minimalist Programming with jekor, stuff on Haskell, like a teardown of Pandoc and an implementation of redo

- Veritasium, mainly physics, https://www.youtube.com/channel/UCHnyfMqiRRG1u-2MsSQLbXA

- Vi Hart, the best thing about mathematics that's online, https://www.youtube.com/channel/UCOGeU-1Fig3rrDjhm9Zs_wg

- Brady Haran's channels on various scientific topics, http://www.bradyharan.com/

mkhattab 2 days ago 2 replies      
https://www.youtube.com/user/NextDayVideo --- Mostly Python talks at conferences, meet ups and other venues

https://www.youtube.com/user/Confreaks --- Like above, but mostly Ruby

https://www.youtube.com/user/emacsrocks --- Emacs Rocks (not updated frequently)

https://www.youtube.com/user/EEVblog --- Electronics Engineering Video blog. This is an excellent resource for electronics hobbyists. This doesn't cover programming much, unless it's micro controller firmware or FPGA programming.

ludwigvan 2 days ago 1 reply      
Some conference channels that are high quality:

- JSConf: http://www.youtube.com/channel/UCzoVCacndDCfGDf41P-z0iA

- InfoQ is very high quality, especially for Java: see http://www.infoq.com/presentations/ (QCon videos, StrangeLoop videos)

- Øredev videos: http://oredev.org/2013/videos

- Channel9 by Microsoft has some top notch videos: http://channel9.msdn.com/ Don't assume that these are all .NET or Windows specific, for example here is one series on Functional Programming by Dr. Erik Meijer: http://channel9.msdn.com/Series/C9-Lectures-Erik-Meijer-Func... Lots of similar videos by Leslie Lamport, Rich Hickey, Simon Peyton Jones. See http://channel9.msdn.com/search?term=%22expert+to+expert%22 for example.

bane 2 days ago 0 replies      

Incredible indie singer, original songs and some ridiculously good covers https://www.youtube.com/user/mreebee3



Retrogaming/computing (somehow I find these endlessly relaxing):



https://www.youtube.com/user/CGRundertow (the old videos where they cover the games are great and funny; they've moved their game content off of YouTube due to overzealous copyright enforcement)

https://www.youtube.com/user/Chrontendo probably the most scholarly look at the NES ever made




https://www.youtube.com/user/MrGameSack (incredibly well produced show)

https://www.youtube.com/user/Gamester81 (another great show, guy also produces his own Coleco games)



https://www.youtube.com/user/lukemorse1 (a retrogaming legend; lives in Japan and fixes up old arcade games)

https://www.youtube.com/user/MetalJesusRocks (one of the best produced shows around)





Foreign Travel - Asia





Patrick_Devine 2 days ago 1 reply      
I've been trying to learn how to play Chess properly lately, so the two channels I'm subscribed to are:


ChessNetwork is run by a national master named Jerry who is absolutely hilarious.

networked 2 days ago 0 replies      
BSDs, game design and computer history:

https://www.youtube.com/user/bsdconferences/videos collects talks from various BSD conferences. An interesting non-technical talk from the collection is "A Narrative History of BSD" by Marshall Kirk McKusick (https://www.youtube.com/watch?v=ds77e3aO9nA).

https://www.youtube.com/user/Froblyx/videos lectures on game design and development by Chris Crawford (Balance of Power, The Art of Computer Game Design) uploaded by the man himself. "The Dragon Speech" of his can be found elsewhere on YouTube (https://www.youtube.com/watch?v=_04PLBdhqZ4).

https://www.youtube.com/user/VintageCG/videos early computer graphics demo reels, mostly from the '80s.

3rd3 2 days ago 0 replies      
Speaking of YouTube: it's possible to subscribe to your subscription list via RSS as described here:


Unfortunately there does not seem to be a way to get the watch-later list as a feed. But you can subscribe to custom lists like so:

Just replace PLAYLIST_ID accordingly. (Note that this is limited to 50 entries per feed.)
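As a rough illustration (the feed endpoint here is from memory, so verify it against YouTube's current docs), building the playlist feed URL and pulling titles out of an Atom response looks like this; the sample payload is a stand-in for a real response, which is capped at 50 entries:

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

def playlist_feed_url(playlist_id):
    # Endpoint from memory; check YouTube's current docs before relying on it.
    return "https://www.youtube.com/feeds/videos.xml?playlist_id=" + playlist_id

def entry_titles(atom_xml):
    """Extract each <entry>'s <title> text from an Atom feed document."""
    root = ET.fromstring(atom_xml)
    return [e.find(ATOM_NS + "title").text for e in root.iter(ATOM_NS + "entry")]

# Made-up stub feed standing in for a real (max 50 entries) response:
sample = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Watch Later-ish</title>
  <entry><title>Video one</title></entry>
  <entry><title>Video two</title></entry>
</feed>"""

print(playlist_feed_url("PLAYLIST_ID"))
print(entry_titles(sample))  # ['Video one', 'Video two']
```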

iglookid 2 days ago 0 replies      
I'm subscribed to ~200 YouTube channels, and I highly recommend veritasium, Brady Haran's network, CGP Grey and minutephysics that others have already mentioned. Here are some other channels that don't seem to have been mentioned so far:


TommyEdisonXP - A jovial and friendly blind guy, who talks about, and answers questions about, how it is to be blind:


Arvind Gupta - He makes simple toys out of cheap materials, and explains the physics behind the toys. He does this full time, and works to popularize science at a premier Astrophysics and Astronomy research institute in India.


Grand Illusions - Like Arvind Gupta, this guy has collected toys and curiosities from around the world, and has dedicated a channel to document them:


Backyard Brains - They perform simple and interesting experiments on nervous systems of organisms:


Bite Sci-zed - A brilliant science channel run by self-confessed science nerd, Alex Dainis:


smalin / Music Animation Machine - Brilliant, brilliant visualizations of western classical music pieces, that help you understand the structure of the music much better if you're a beginner:



For example, see the 2nd movement from Beethoven's 9th symphony: http://www.youtube.com/user/smalin

Talking Animals - Human voices dubbed on viewer-submitted videos of pets. Funnier than you might expect!



I haven't sampled the following channels very well, but they seem promising.


Backstage Science: https://www.youtube.com/channel/UCP16wb-IThCVvM8D-Xx8HXA

It's Okay to be Smart: https://www.youtube.com/channel/UCH4BNI0-FOK2dMXoFtViWHw

Household Hacker: https://www.youtube.com/channel/UCI4I6ldZ0jWe7vXpUVeVcpg

The Slo Mo Guys - As seen on TV: https://www.youtube.com/channel/UCUK0HBIBWgM2c4vsPhkYY4w


Youtube "Leanback" for discovering channels and videos. Slick interface, but doesn't seem to surface quality content. Seems to just prioritize trending items.


herrherr 2 days ago 0 replies      
Currently without a doubt: https://www.youtube.com/user/mathematicalmonk

An extensive series about machine learning (100+ videos).

paddy_m 2 days ago 0 replies      
http://www.youtube.com/user/KEF791 Keith Fenner runs a machine shop and makes videos of his projects. He is thoroughly experienced and does some interesting projects.
Spittie 2 days ago 0 replies      
I don't follow many channels, but I'm always impressed by this guy: https://www.youtube.com/user/Fredzislaw100

He has lots of videos with circuits that seem impossible, but that are just full of hacks. He goes as far as putting circuits inside his components (LEDs, switches...).

corywright 2 days ago 0 replies      
My favorite YouTube channel is that of Matthias Wandel:

A software engineer by training, Matthias was one of the first 10 employees at RIM, and after he "retired" a few years ago he began making woodworking videos. His videos are great because of his background (he grew up working on his father's sawmill) and he brings an engineer's approach to woodworking. I've never done any woodworking, but I enjoy his videos because of the way he approaches and solves design problems.

Some of the marble machines he's made are incredible.

imkevinxu 2 days ago 1 reply      


I have a soft spot for RoosterTeeth, they make Red vs. Blue and hilarious Let's Plays and other geeky humor videos. It's one of those shows you can watch in the background while eating or something

ericb 2 days ago 1 reply      
The king of random. Hacks, experiments, explosions:


pdkl95 2 days ago 0 replies      
Periodic Table Of Videos


All the people posting Brady's [1] various channels (numberphile, sixtysymbols, etc) left out the best one: periodic videos. Not only is it interesting chemistry in the same format as numberphile/sixtysymbols, it also features the best mad-scientist HAIR on the planet [2].

[1] http://www.bradyharan.com/

[2] https://www.google.com/search?tbm=isch&q=Martyn+Poliakoff


Extra Credits


Incredibly detailed and insightful discussion of games from what you might call a sociological perspective. They speak both as game designers and as players. A special emphasis is given to showing how "games" are a type of art, enabling certain new kinds of expression.

I'll caution that I don't mean "game theory" (Nash Equilibrium, etc) - Extra Credits discusses things like "interactive experience" vs "passive reading/watching", how mechanics can be used as a storytelling medium, industry issues, abusive (or just plain annoying) design choices, and theories such as the Uncanny Valley and the illusion of choice.

All packed into short ~5-6min, almost-animated, fun little videos.

daturkel 2 days ago 0 replies      
Frank Howarth does awesome woodworking projects and makes incredible videos showing how he makes them. Usually the videos are narrated with an explanation of the process, but sometimes he does them in a stop-motion style where you never see him at all, so the projects just build themselves: https://www.youtube.com/user/urbanTrash
jmpe 2 days ago 0 replies      
I'm mostly into hardware, these two provide me with top content:

Ben Krasnow, physics for the underfunded:


Ham Radio Now, lots of SDR talk and good content:


I have many others (e.g. CCC), but on mobile inside an observatory atm.

dannyking 2 days ago 1 reply      
If short educational videos are your thing, here's a pretty comprehensive list of the highest quality channels out there:

(My personal favorites are Vsauce, Veritasium, SciShow, Crash Course & CGP Grey)

ASAPScience - Fun, short, interesting facts/explanations: https://www.youtube.com/channel/UCC552Sd-3nyi_tk2BudLUzA

BigThink - Prominent people talking about interesting issues in short segments: https://www.youtube.com/channel/UCvQECJukTDE2i6aCoMnS-Vg

CGP Grey - an awesome professor talking about interesting facts: https://www.youtube.com/channel/UC2C_jShtL725hvbm1arSV9w

Computerphile - short videos explaining concepts in CS: https://www.youtube.com/channel/UC9-y-6csu5WGm29I7JiwpnA

Crash Course - beautifully designed courses for several subjects, segmented into short videos. Highly recommended! https://www.youtube.com/channel/UCX6b17PVsYBQ0ip5gyeme-Q

Engineering Explained - learn everything you wanted to know about car internals: https://www.youtube.com/channel/UClqhvGmHcvWL9w3R48t9QXQ

IFLScience - short science news updates: https://www.youtube.com/channel/UCvOTgnW7oj9ZWDd2y5TEApw

Minute Earth - beautifully animated short science fact videos: https://www.youtube.com/channel/UCeiYXex_fwgYDonaTcSIk6w

Minute Physics - as above, but purely about physics: https://www.youtube.com/channel/UCUHW94eEFW7hkUMVaZz4eDg

SciShow - this was one of the first short science video channels - awesome: https://www.youtube.com/channel/UCZYTClx2T1of7BRZ86-8fow

SciShow Space - as above, but about space: https://www.youtube.com/channel/UCrMePiHCWG4Vwqv3t7W9EFg

SixtySymbols - short videos talking about interesting symbols: https://www.youtube.com/channel/UCvBqzzvUBLCs8Y7Axb-jZew

SmarterEveryDay - awesome science explanation videos: https://www.youtube.com/channel/UC6107grRI4m0o2-emgoDnAA

Veritasium - very high quality science explanation videos - awesome guy: https://www.youtube.com/channel/UCHnyfMqiRRG1u-2MsSQLbXA

Vsauce - mindblowing videos, usually around 10m, taking you on a tour of interesting facts and ideas. Check out Vsauce2 & 3 too. https://www.youtube.com/channel/UC6nSFpj9HTCZ5t-N3Rm3-HA

dfc 2 days ago 1 reply      
It is unfortunate that this "AskHN" turned into a link dump. This could have been a much more interesting discussion if people capped their list of favorites at two or three channels.
Oculus 2 days ago 0 replies      
I haven't been able to find any programming channels I'm absolutely in love with (comeback everyday to), but I do have a couple channels that keep me entertained:

Seananners - https://www.youtube.com/user/SeaNanners

GassyMexican - https://www.youtube.com/user/GassyMexican

TheMrSark - https://www.youtube.com/user/TheMrSark

d0m 2 days ago 0 replies      
That will go in the /whatever, but here's a very good, lesser-known metal guitar player: https://www.youtube.com/user/charlieparradelriego/
wsc981 2 days ago 0 replies      
I like CodingMadeEasy[0]. It's made by a guy (college drop-out) that is working on his own start-up of sorts, trying to become a game developer. He has a nice MonoGame tutorial, for example. His other tutorials are mostly related to game development as well, I think. And certainly not limited to C#.

[0]: https://www.youtube.com/user/CodingMadeEasy

deadfall 2 days ago 0 replies      
Computerphile - british channel - professors/students/scientist talking about computers/programming/history https://www.youtube.com/channel/UC9-y-6csu5WGm29I7JiwpnA

edit -- whatever category

Yogscast - british guys playing computer games like minecraft (their arguments are very funny) - https://www.youtube.com/channel/UCH-_hzb2ILSCo9ftVSnrCIQ

news - VICE and VICE news channels for real journalism.

theboss 2 days ago 0 replies      
I follow two youtube channels very closely and I think HN should check them out. Both deal with Powerlifting. These two youtube channels are particularly interesting because they both follow two guys who work really really hard.

The first is Ben Rice's: https://www.youtube.com/user/Rev198
The next is Pete Rubish's: https://www.youtube.com/user/PeteRubish1

These guys are strong as hell and watching them continuously work hard to get such small returns (5-10 lbs) is really motivating for me.

prezjordan 2 days ago 0 replies      
Matthias Wandel (https://www.youtube.com/user/Matthiaswandel) has a woodworking channel that I consider to be a massage for my brain. I know nothing about carpentry, but his videos are so relaxing.

I also like carsandwater for his "Red-Hot Nickel Ball" videos. https://www.youtube.com/watch?v=9qSEfcIfYbw&list=TLIZX0Wqcq2...

Numberphile has some great recreational math videos. https://www.youtube.com/user/numberphile

And, as a few others have mentioned, Veritasium.

erikstarck 2 days ago 0 replies      
If you're in Sweden I hope you follow the Swedish version of hacker news on youtube, Hackernytt TV: http://youtube.com/user/HackerNyttTV
bowmanb 2 days ago 0 replies      
Some not-yet-mentioned I subscribe to:

Indie game dev: https://www.youtube.com/user/WolfireGames

Clojure talks: https://www.youtube.com/user/ClojureTV

Android: https://www.youtube.com/user/androiddevelopers

A filmmaker I enjoy: https://www.youtube.com/user/caseyneistat

[Shameless plug] Computer science paper presentations: https://www.youtube.com/user/PapersWeLove

tezza 2 days ago 1 reply      
Zero Technical Angle, yet still amazing: FailArmy


akhiluk 2 days ago 1 reply      
I'm more into general information on Youtube. Do check out Vsauce [ http://youtube.com/user/vsauce ] and CGP Grey [ http://youtube.com/user/cgpgrey ] if you haven't already.
takeoutweight 2 days ago 0 replies      
I have been enjoying learning Go (the game) and Nick Sibicky has a great lecture series. It's hard to find introductory material that goes deeper than just the basic rules of the game so this has been a valuable resource.


chestnut-tree 2 days ago 0 replies      
Not programming related: I like this recipe channel - recipes are filmed and posted every week. The presentation is clear and straightforward (with a dash of humour). There's plenty of variety in the recipes too: meat dishes, vegetarian, cakes, bread etc


nsxwolf 2 days ago 0 replies      
Matt Barton's "Matt Chat" has a wealth of interviews with early PC gaming legends. John Romero, Brian Fargo, and many many more.


unchocked 2 days ago 0 replies      

A jet engine mechanic way up in the Canadian north. If you've always wanted to get hands-on with a gas turbine, this is your guy.

sp332 2 days ago 0 replies      
I like freddiew's channel, RocketJump. He messes with the medium, check out http://www.youtube.com/watch?v=-e_NiwPz-MQ and the "behind-the-scenes" for example.
ivank 2 days ago 0 replies      
https://www.youtube.com/user/ThreadbareInc/videos is doing a detailed Let's Play of Deus Ex
captainmojo 2 days ago 1 reply      
If anyone wants to curate all these links as a group, here's a subreddit: http://www.reddit.com/r/hnyoutubechannels/
smoyer 2 days ago 0 replies      
The GoogleDevelopers channel you reference is great, but now that I've fallen back in love with JavaEE it has to be Adam Bien's channel: https://www.youtube.com/channel/UCksTNgiRyQGwi2ODBie8HdA.
EliRivers 2 days ago 0 replies      
World's most famous Australian, Natalie Tran. https://www.youtube.com/user/communitychannel

As I type this, the leading video is an unboxing :)

lateguy 2 days ago 0 replies      
Stanford E-corner: Knowledge and inspiration, one entrepreneur at a time. Stanford University's Entrepreneurship Corner offers videos featuring entrepreneurship and innovation thought leaders.


matiasp 2 days ago 0 replies      
tfn 2 days ago 0 replies      
If you're looking for talks on Java and related, you're missing out a lot of good videos from the JavaZone conference by limiting yourself to YouTube:


fasteddie31003 2 days ago 0 replies      

Applied Science is pretty cool. He is a master with electronics and fabrication.

rexreed 2 days ago 0 replies      
The digest of everything that happens @ TechBreakfast: http://www.youtube.com/tekbreakfast
d0ugie 2 days ago 1 reply      
This thread made me wonder if there's any way to export and import one's Youtube channels (as one could with RSS into an opml file), this is the most promising article I could find (but didn't test): http://www.iliketomakestuff.com/heres-how-to-export-your-you...

Any other leads?
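If you do end up with an OPML export (via the linked article's approach, or any standard OPML file), pulling the feed URLs back out is a few lines. A sketch against a made-up sample file, with placeholder channel ids:

```python
import xml.etree.ElementTree as ET

def feeds_from_opml(opml_text):
    """Collect the xmlUrl attribute from every <outline> in an OPML document."""
    root = ET.fromstring(opml_text)
    return [o.get("xmlUrl") for o in root.iter("outline") if o.get("xmlUrl")]

# Made-up sample; real exports nest one <outline> per subscription.
sample = """<?xml version="1.0"?>
<opml version="1.1"><body>
  <outline text="subs">
    <outline text="Channel A" xmlUrl="https://www.youtube.com/feeds/videos.xml?channel_id=UC_EXAMPLE_1"/>
    <outline text="Channel B" xmlUrl="https://www.youtube.com/feeds/videos.xml?channel_id=UC_EXAMPLE_2"/>
  </outline>
</body></opml>"""

print(feeds_from_opml(sample))  # the two xmlUrl values, in document order
```

Re-importing is then just feeding that list to whatever RSS reader you use.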

pkrumins 2 days ago 0 replies      
MIT OCW is my favorite one: http://www.youtube.com/user/MIT. I'm always on the lookout for new MIT courses.
sown 2 days ago 0 replies      
It's kind of interesting that so many people have included Youtube into their media consumption diet, perhaps in the place of Radio & TV?
ajayjain 2 days ago 0 replies      
https://www.youtube.com/user/sixtysymbols - really interesting and well made physics videos

https://www.youtube.com/user/flitetest - fixed wing and (multi)copter builds and flights. Mostly mechanical, electrical, and aerospace.

minutephysics, minuteearth, and crashcourse are also great.

inovator 2 days ago 0 replies      
Any one know a good iOS youtube channel?

I only have http://www.youtube.com/user/rwenderlich

thesoonerdev 2 days ago 0 replies      
Can I add videos from Vimeo? I love the Microconf videos on Vimeo, quite suited for the HN audience I would think: http://vimeo.com/search?q=microconf

The AMA by Peldi Guilizzoni (Balsamiq) is an excellent one.
jonmarkgo 2 days ago 0 replies      
shankysingh 2 days ago 0 replies      
For hobby short-movie makers like me:

1. Film Riot: https://www.youtube.com/user/filmriot

2. Indy Mogul (now defunct): https://www.youtube.com/user/indymogul
hanley 2 days ago 0 replies      
For Python people, http://pyvideo.org/ aggregates videos from conferences and meetups.
d4mi3n 2 days ago 0 replies      
The Idea Channel is pretty good. It's funded by PBS and is generally geared towards presenting interesting topics for people to think about and discuss: https://www.youtube.com/user/pbsideachannel
tixzdk 2 days ago 0 replies      
For a tougher treatment of complex math subjects I really enjoy mathematicalmonk's Khan Academy-style videos:


atmosx 2 days ago 0 replies      
Thanks for those links, I didn't even know I could find such interesting channels on YouTube! :-)
slvv 2 days ago 0 replies      
I'm amazed no one has mentioned The Brain Scoop yet! Natural history, dissections, all kinds of awesome stuff.


zamabe 2 days ago 0 replies      

Sixty Symbols

CGP Grey


Smarter Every Day

The Brain Scoop



Crash Course

(Sorry for the lack of links, but that takes forever)

(And the formatting. I don't know how to make it \n)

kplex 2 days ago 0 replies      
More Mythbustery goodness.


wskinner 2 days ago 0 replies      
Jamie452 2 days ago 1 reply      
hevsuit 2 days ago 0 replies      

Internet Culture and Tech Stuff

james-bronze 2 days ago 0 replies      
Thank you so much, I've been looking for good YouTube channels!
pepon 2 days ago 0 replies      
1Veritasium, SmarterEveryDay
InclinedPlane 2 days ago 0 replies      
- https://www.youtube.com/user/1veritasium (wonderful science education stuff)

- https://www.youtube.com/user/EEVblog (electronics stuff)

- https://www.youtube.com/user/lindybeige (irreverent but informative takes on historical stuff)

- https://www.youtube.com/user/bkraz333 (Ben Krasnow, amazing DIY home laboratory stuff)

- https://www.youtube.com/user/urbanTrash (Frank Howarth, fantastic wood crafting)

- https://www.youtube.com/user/minutephysics (well explained science stuff, see also:)

- https://www.youtube.com/user/minuteearth

- https://www.youtube.com/user/wickiemedia (if you've ever been curious about pro audio, live or recorded, this channel has tons of great tutorials and explanations)

- https://www.youtube.com/user/setiinstitute (SETI talks, probably boring unless you're really engaged with cosmology, astronomy, exobiology, or space exploration, but if you are then there are some amazing talks)

Also, I've found defcon and ccc talks to have some amazing content occasionally. Try searching for "defcon" or "30c3" to get started.

j_s 2 days ago 1 reply      
Solidly in the 'other' category - any recommendations for kids?
whatevsbro 2 days ago 0 replies      
Go Performance Tales jmoiron.net
271 points by signa11  4 days ago   41 comments top 12
jws 4 days ago 2 replies      
I found the bit about Go using native AES instructions to accelerate map key hashing most interesting. This accounted for a >50% throughput increase when he found AWS c1.xlarge instances that had AES-NI, compared to those that didn't.

This is the kind of detail most developers would not be aware of, and to be fair, even now knowing it exists the only reference I can google up at golang.org is the C source code of runtime/alg.c where you will see

    if(use_aeshash) {
        runtime·aeshash(h, s, a);
        return;
    }
no hint that it might reduce your hosting costs by 33% or account for some huge variation in performance between one test machine and the next, or even individual runs if you are spinning up cloud instances to do your testing.

Does your CPU have the AES, SSE3 and SSE4.1 cpu capability bits all turned on? If so, you will hash mightily! Do you even know where to look to check?
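One place to look, on Linux, is the flags line of /proc/cpuinfo. A minimal Go sketch; the flag names aes, ssse3 and sse4_1 are what the Linux kernel reports, which is an assumption here about how they map onto the capability bits the Go runtime actually checks:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// hasCPUFlag reports whether flag (e.g. "aes") appears in a
// "flags : ..." line of /proc/cpuinfo-style text.
func hasCPUFlag(cpuinfo, flag string) bool {
	for _, line := range strings.Split(cpuinfo, "\n") {
		if !strings.HasPrefix(line, "flags") {
			continue
		}
		for _, f := range strings.Fields(line) {
			if f == flag {
				return true
			}
		}
	}
	return false
}

func main() {
	// Linux-only: /proc/cpuinfo does not exist elsewhere.
	data, err := os.ReadFile("/proc/cpuinfo")
	if err != nil {
		fmt.Println("could not read /proc/cpuinfo:", err)
		return
	}
	for _, flag := range []string{"aes", "ssse3", "sse4_1"} {
		fmt.Printf("%s: %v\n", flag, hasCPUFlag(string(data), flag))
	}
}
```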

jsnell 4 days ago 0 replies      
Just a note on the zlib optimization patches. The blog post is linking to an old version, there's a newer one from a month ago. Also, the patch still appears to be a bit buggy (e.g. corrupt output being generated by the new deflate strategy), so don't plan on actually deploying it.
DrJokepu 4 days ago 7 replies      
I find it interesting how insightful, technical articles like this receive hardly any comments while the usual "softer" articles that tend to dominate the Hacker News frontpage these days receive dozens if not hundreds of comments. I wonder what this says about us, HN readers.
ihsw 4 days ago 1 reply      
> Using a map[int]* Metric instead of a map[string]struct{} would give us that integer key we knew would be faster while keeping access to the strings we needed for the indexes. Indeed, it was much faster: the overall throughput doubled.

I'm a little sceptical of this -- type assertions are fast but it's an extra step to initializing a struct. It would have been nice to see tests done comparing map[string]struct{} to map[int]struct{} and comparing map[string]* Metric to map[int]* Metric.

Also, there is no way to escape an asterisk, so I apologize for the awkward space after each one.
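For anyone who wants to poke at the key-type difference themselves, here is a small self-contained benchmark sketch using the stdlib's testing.Benchmark. It measures lookups only, not the type-assertion step the comment mentions, and the numbers will vary by CPU (especially with and without AES-NI):

```go
package main

import (
	"fmt"
	"strconv"
	"testing"
)

// buildMaps fills a string-keyed and an int-keyed map with the same n entries.
func buildMaps(n int) (map[string]int, map[int]int, []string) {
	byStr := make(map[string]int, n)
	byInt := make(map[int]int, n)
	keys := make([]string, n)
	for i := 0; i < n; i++ {
		keys[i] = strconv.Itoa(i)
		byStr[keys[i]] = i
		byInt[i] = i
	}
	return byStr, byInt, keys
}

func main() {
	const n = 1 << 16
	byStr, byInt, keys := buildMaps(n)

	// testing.Benchmark lets a plain program reuse the stdlib's timing loop.
	str := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			_ = byStr[keys[i%n]]
		}
	})
	num := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			_ = byInt[i%n]
		}
	})
	fmt.Println("map[string]int lookups:", str)
	fmt.Println("map[int]int lookups:   ", num)
}
```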

SixSigma 4 days ago 0 replies      
If you want more details on Go profiling, this Go Lang blog post is a great place to look


timtadh 4 days ago 0 replies      
I have also played around trying to achieve a high performance trie in Go.[1] My approach is to use the Ternary Search Trie structure. Unfortunately, I have not yet approached either the performance of the native hash table or a hash table in Go (although Sedgewick tells us you should be able to beat a hash table). My TST does not yet have the Patricia Trie optimization (of collapsing long internal runs). Perhaps with that addition it will get closer to hash table performance.

Also everything he said about channels also holds true in my experience. I haven't tried writing a C library for Go yet but his discovery is pretty interesting for when I dive into that.

[1] : https://github.com/timtadh/data-structures/blob/master/trie/...
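For readers unfamiliar with the structure, here is a bare-bones ternary search trie sketch in Go (integer values, no Patricia collapsing); it illustrates the idea only and is not the linked library's implementation:

```go
package main

import "fmt"

// tstNode is one node of a ternary search trie: lo/hi hold keys whose
// current byte orders before/after c, eq descends to the key's next byte.
type tstNode struct {
	c          byte
	lo, eq, hi *tstNode
	val        int
	has        bool
}

// Put inserts key with value v, returning the (possibly new) subtree root.
func Put(n *tstNode, key string, v int) *tstNode {
	if key == "" {
		return n
	}
	c := key[0]
	if n == nil {
		n = &tstNode{c: c}
	}
	switch {
	case c < n.c:
		n.lo = Put(n.lo, key, v)
	case c > n.c:
		n.hi = Put(n.hi, key, v)
	case len(key) > 1:
		n.eq = Put(n.eq, key[1:], v)
	default:
		n.val, n.has = v, true
	}
	return n
}

// Get looks key up, reporting its value and whether it is present.
func Get(n *tstNode, key string) (int, bool) {
	if key == "" {
		return 0, false
	}
	c := key[0]
	for n != nil {
		switch {
		case c < n.c:
			n = n.lo
		case c > n.c:
			n = n.hi
		case len(key) > 1:
			n, key = n.eq, key[1:]
			c = key[0]
		default:
			return n.val, n.has
		}
	}
	return 0, false
}

func main() {
	var root *tstNode
	for i, k := range []string{"sea", "shells", "she", "sells"} {
		root = Put(root, k, i)
	}
	v, ok := Get(root, "she")
	fmt.Println(v, ok) // 2 true
	_, ok = Get(root, "shell")
	fmt.Println(ok) // false
}
```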

yukichan 4 days ago 1 reply      
The biggest performance issues I think some people run into with Go involve reflection, which seems to be slow. Something that does a lot of JSON parsing, for example, could be much slower in Go than in Java, Python, or JavaScript. I don't have any data, but I've known people who work with Go to complain about it. I wonder if a JIT or AOT compiler might help.
stcredzero 3 days ago 2 replies      
I've been looking into false sharing and other pitfalls of parallelism on multi-core systems.


It seems like our multi-core hardware isn't that well thought out yet. In particular, locks can cause false sharing. Even Go channels are based on locks!

I would love to see some kind of annotation in golang for padding certain structs to cache-line boundaries. This value can be read from most processors, so it could be done in a cross-platform fashion. The gringo ring-channel library has to do its own padding to avoid false sharing.


    type Gringo struct {
        padding1 [8]uint64
        lastCommittedIndex uint64
        padding2 [8]uint64
        nextFreeIndex uint64
        padding3 [8]uint64
        readerIndex uint64
        padding4 [8]uint64
        contents [queueSize]Payload
        padding5 [8]uint64
    }
There was a similar technique for padding Java objects, but it now turns out that the JVM is smart enough to optimize the padding out!

awda 4 days ago 1 reply      
> All of our existing indexes were based on string data which had associated integer IDs in our backend

You already have a perfect hash function :).

logicchains 4 days ago 1 reply      
This is probably a stupid question, but I wonder if the author could have used slices instead of maps with integer keys. It would have used more memory, but it would probably also have been significantly faster. A significant proportion of the performance issues I see raised on the Go mailing list seem to involve maps.
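A sketch of that slice-as-map idea, assuming small, dense integer IDs; the Metric type here is hypothetical, standing in for the post's per-metric values:

```go
package main

import "fmt"

// Metric is a hypothetical stand-in for the post's per-metric payload.
type Metric struct{ Name string }

// lookupTable builds a slice indexed directly by ID: no hashing at all,
// at the cost of one pointer slot per possible ID, used or not.
func lookupTable(ids []int, names []string) []*Metric {
	max := 0
	for _, id := range ids {
		if id > max {
			max = id
		}
	}
	table := make([]*Metric, max+1)
	for i, id := range ids {
		table[id] = &Metric{Name: names[i]}
	}
	return table
}

func main() {
	table := lookupTable([]int{3, 7, 1}, []string{"cpu", "mem", "net"})
	fmt.Println(table[7].Name)   // mem
	fmt.Println(table[2] == nil) // true: absent IDs are just nil slots
}
```

The memory trade-off the comment mentions is visible in the nil slots; it only pays off while IDs stay reasonably dense.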
sagichmal 4 days ago 1 reply      
Good article. It closely reflects the experience we've had at SoundCloud in our more heavily-stressed services.
wolf550e 3 days ago 0 replies      
Do they have to use zlib? For on the wire compression LZ4 is better, for archiving lzma2 or zpaq is better.
SpaceX CRS-3 Mission spacex.com
265 points by ColinWright  2 days ago   122 comments top 24
Arjuna 2 days ago 3 replies      
Here is a CRS-3 launch and ascent event list. Please note that all timings and values are approximate, as I have based them on a mix of CRS-2 profile telemetry and CRS-3 mission notes.

T-00:00:00 - Falcon 9 lift-off. Stage 1's nine Merlin engines produce 1.3M pounds of thrust.

T+00:00:07 - Falcon 9 clears the launch tower.

T+00:01:00 - Altitude: 6km, Velocity: 241m/s, Downrange distance: 1km

T+00:01:10 - Falcon 9 achieves supersonic speed.

T+00:01:23 - Falcon 9 achieves maximum dynamic pressure (Max Q).

T+00:02:00 - Altitude: 30km, Velocity: 1km/s, Downrange distance: 23km

T+00:02:30 - Altitude: 51km, Velocity: 1.8km/s, Downrange distance: 59km

T+00:02:41 - MECO (Main Engine Cut-Off) Altitude: 80km, Velocity: Mach 10

T+00:02:44 - Stage 1 separates from Stage 2.

T+00:02:45 - Stage 2's single Merlin engine ignites.

T+00:03:25 - Dragon's nose cone is jettisoned.

T+00:04:21 - Altitude: 148km, Velocity: 3.2km/s, Downrange distance: 346km

T+00:05:22 - Altitude: 182km, Velocity: 4km/s, Downrange distance: 541km

T+00:06:24 - Altitude: 200km, Velocity: 4.6km/s, Downrange distance: 767km

T+00:07:31 - Altitude: 210km, Velocity: 5.6km/s, Downrange distance: 1,080km

T+00:09:40 - SECO (Second-stage Engine Cut-Off)

T+00:10:15 - Stage 2 separates from Dragon.

Luc 2 days ago 1 reply      
That Russian spy-boat Elon Musk tweeted about seems to be moving away from Cape Canaveral:



Anyone know where the stage is supposed to be coming down?

bfe 2 days ago 1 reply      
Alan Boyle reports "first-stage reignited during descent and video was being sent back." https://twitter.com/b0yle/statuses/457248899464314880
rory096 2 days ago 0 replies      
>Data upload from tracking plane shows landing in Atlantic was good! Several boats enroute through heavy seas.



timw6n 2 days ago 1 reply      
Anyone know what the timeframe is for the test of the first-stage soft landing? The livestream was just showing the Dragon and seems to have ended now.
doe88 2 days ago 0 replies      
exDM69 2 days ago 1 reply      
Are there alternatives to livestream.com for the livestream?

It works only intermittently from northern Europe. It only plays back for a few seconds and then stops for buffering...

ColinWright 2 days ago 1 reply      
Elon Musk's twitter feed is worth a watch to get news:


xtc 2 days ago 1 reply      
Launch success. I'm incredibly excited to hear about first stage recovery attempt. This is going to be huge no matter the outcome.
sargun 2 days ago 0 replies      
It sounds like it was successful "Data upload from tracking plane shows landing in Atlantic was good! Several boats enroute through heavy seas." - Elon Musk

Link: https://twitter.com/elonmusk/status/457307742495993856

InclinedPlane 2 days ago 1 reply      
At an absolute minimum this flight has shown that SpaceX is capable of testing reuse of the first stage on operational launches (including use of landing legs on the first stage) with no impact to the launch (provided there is sufficient payload margin, which there will be on any further Dragon launches). That's a big deal, it means they get tens of millions of dollars in free testing subsidized by their customers, and that gives them a huge leg up in working towards reusability.
13throwaway 2 days ago 1 reply      
VLC/mplayer live stream

pip install livestreamer

apt-get install rtmpdump

pip install python-librtmp

livestreamer ustream.tv/nasahdtv best

ColinWright 2 days ago 0 replies      
Currently the countdown shows an intended launch at roughly 18:45 UTC, 19:45 BST. It may change because of the weather, so you'll need to keep an eye on it, or keep the window open and the sound on.
BrandonMarc 2 days ago 1 reply      
Would it be possible to add "live webcast at 2:30pm ET / 11:30am PT"?
mladenkovacevic 2 days ago 3 replies      
Is NASA ever this quick to restart a launch mission after a scrubbed attempt? 3 days seems blazingly fast considering the risks and possible consequences.
swatkat 2 days ago 0 replies      
Cool! Would like to know how the landing with the legs went. By the way, here's the launch video:


"Dirty" water "geyser" rose up as high as the rocket itself :) Looks like water from acoustic suppression system had created a puddle under rocket?!

jhuckestein 2 days ago 0 replies      
Good news: Elon just tweeted "Data upload from tracking plane shows landing in Atlantic was good! Several boats enroute through heavy seas."
egwynn 2 days ago 1 reply      
The `Date` section says

  Fri Apr, 18 2014 2:45 PM EDT
  Fri Apr, 18 2014 4:00 PM EDT
But the `About` section says

  ... targeted to launch on April 14 at 4:58 pm EDT ...
Looks like they copied and pasted the `About` info from their last launch.

th0ma5 2 days ago 0 replies      
New cubesats deployed too!
evanm 2 days ago 2 replies      
Is there a livestream of this?
avoutthere 2 days ago 0 replies      
Congrats to the folks at SpaceX on another great launch.
oneweirdtrick 2 days ago 2 replies      
If the launch is a success, how will the world react?
ColinWright 2 days ago 2 replies      
Is this submission being flagged by users, or has it tripped some sort of scoring penalty?


Dropping suddenly from 3rd to 20th on the front page suggests some sort of penalty being applied, but is it community driven, or automated? Even with the new openness about the actions of the moderators on HN I still find some things deeply confusing.

Added in edit: This does bring home just how important upvotes are. I've seen how disproportionate the effect of downvotes is on an item's ranking - one downvote outweighs many upvotes. If you like something, upvote it, or see it sink without trace.

Ubuntu 14.04 released ubuntu.com
261 points by pjvds  3 days ago   125 comments top 23
beaumartinez 3 days ago 5 replies      
Ubuntu 14.04 comes with Python 3.4, but unfortunately, it doesn't bundle the ensurepip module (and a host of others). By the looks of things, the idea was to use Ubuntu's own packages instead[1], but it didn't make it in time.

This means that pip being bundled by default (one of Python 3.4's coolest features) is missing. Trying to create a virtualenv using the bundled virtualenv module fails as well. Big mess.[2]

[1] https://launchpad.net/ubuntu/+source/python3.4/3.4.0-2ubuntu...

[2] https://bugs.launchpad.net/ubuntu/+source/python3.4/+bug/129...

g8oz 3 days ago 3 replies      
For those of you who find the Unity desktop to be an overweight example of "designer" solipsism, try out the alternative light weight desktops like Lubuntu (LXDE based) or Xubuntu (XFCE based).

sudo apt-get install xubuntu-desktop

or

sudo apt-get install lubuntu-desktop

jcastro 3 days ago 1 reply      
Just a reminder that we are now publishing Vagrant boxes as well:

- https://vagrantcloud.com/ubuntu

mrinterweb 3 days ago 2 replies      
For some strange reason, the 14.04 release is not showing up for me with "sudo do-release-upgrade". I have tried to update my primary software source to their main server and main for the US, and both did not show the availability of the new release.

As a friendly reminder, if you want to download the release as quickly as possible, use the torrent from http://www.ubuntu.com/download/alternative-downloads and be sure to seed for others.

foobarqux 3 days ago 1 reply      
If you use Emacs and depend on Ctrl-space note that you may have to disable the keybinding in ibus-setup and restart the ibus daemon. This bug was claimed to be fixed but it wasn't for me. However I'm running Xmonad with some Gnome utilities.
ziggamon 3 days ago 1 reply      
Tried to find some sort of release notes; the best thing I could find was this: https://wiki.ubuntu.com/TrustyTahr/ReleaseNotes

If someone has a better link, please share!

ausjke 3 days ago 0 replies      
From the release notes:

"Hardware support - ARM multiplatform support has been added, enabling you to build a single ARM kernel image that can boot across multiple hardware platforms. Additionally, the ARM64 and Power architectures are now fully supported. "

Really? Can you do OpenStack with ARM/Power? What do you mean by "fully supported" - does it mean ARM/Power/x86 all have the same set of packages? That has not been the case in the past.

wcchandler 3 days ago 1 reply      
Magnet URI for desktop amd64:


neverminder 3 days ago 2 replies      
I don't know about everyone else, but for me "released" means I can download it from the official location - http://www.ubuntu.com/download/desktop and that is not the case yet.
bluedino 3 days ago 3 replies      
Do people actually use jigdo?
orik 3 days ago 3 replies      
Can anyone comment on the difference between the PC and the Mac 64 bit images?
plg 3 days ago 0 replies      
it still shows beta at the link given above
bsg75 3 days ago 1 reply      
At this point this is a link to beta 2 from March.
pjvds 3 days ago 0 replies      
Desktop and server pages are now pointing to 14.04 as well:

[1] http://www.ubuntu.com/download/desktop

[2] http://www.ubuntu.com/download/server

sobering 3 days ago 0 replies      
Anyone have any guesses as to when this will be available on DigitalOcean?

Edit: As of five minutes ago, it's showing up!

ateevchopra 3 days ago 1 reply      
It's great to see another LTS. Finally time to upgrade from 12.04.
arunk-s 3 days ago 1 reply      
Can anyone point me to the link for Ubuntu GNOME 14.04 (final release)? All I can get is the Trusty beta 2 on their official ubuntugnome.org.
marshray 3 days ago 0 replies      
Figures! I installed my first Ubuntu in years, 12.04.4 LTS, literally yesterday.
ForHackernews 3 days ago 1 reply      
What's the deal with this "Ubuntu Browser" being set as the default?

I feel like Canonical has a serious Not Invented Here problem, where they keep trying to re-invent the wheel by writing their own desktop environment, window manager, browser, startup system, etc.

homulilly 3 days ago 0 replies      
So can you move the application dock to the bottom of the screen without recompiling yet?
floor_ 3 days ago 2 replies      
Does Ubuntu still shill out personal data to Amazon?
kuchaguangjie 3 days ago 1 reply      
Since 10.10, when Unity became the desktop, the Ubuntu desktop has been dead ... f..k Mark ...
gary4gar 3 days ago 0 replies      
Downloading via Bittorrent! Getting 12.04 LTS with great speeds.
Seznam, a Czech search company, previews 3D maps mapy.cz
258 points by rplnt  4 days ago   87 comments top 33
lubos 4 days ago 3 replies      
Honestly, I'm surprised to see Seznam on HN. I grew up on the Czech internet in the 90s, and Seznam.cz (or "Directory" in English) was huge for a long time until Google eventually beat them. The vibe I get from the comments here is as if Seznam.cz were some hot new company, while it is really a dying dinosaur like Yahoo.

Maps are not a core competency of this company. They are early internet pioneers who have maintained a huge portfolio of services for almost two decades. Maps is just another service they are working on to keep users from leaving for Google/YouTube, Facebook, etc.

Btw, I spoke to the Seznam.cz founder briefly once at a business event in Slovakia back in 2000, when I was 17.

edit: their maps are created by Melown.com, see example https://www.melown.com/maps/

lars 4 days ago 1 reply      
The Norwegian site Finn.no got 3D maps that looked exactly like this back in 2008. [0]

As the link explains, the technology originates from the Swedish air force, and was meant to guide missiles through urban landscapes. It was since commercialized for civilian uses by the company C3 Technologies.

This looks like it's exactly the same technology.

[0]: http://labs.finn.no/sesam-3d-map-3d-revolution-the-people/

bhouston 4 days ago 1 reply      
I think that given that Google already has 3D depth coverage from its street view machines [1], it should be possible to combine that data with some medium resolution overhead 3D scans to create something similar, and likely even higher quality at the street level.

I wonder why Google hasn't done it yet. I don't think there are any real technical limitations. It may be that getting it fast is hard and the usefulness from an end user perspective isn't there yet?

[1] http://gizmodo.com/depth-maps-hidden-in-google-street-view-c...

zk00006 4 days ago 2 replies      
Based on the posts, people think that seznam.cz is a startup and Google will buy it like in 3, 2, 1. This is complete nonsense. Seznam is far from a startup, and I am pretty sure their goal is not "only" to get acquired. Its mapping service is superior to Google's as far as the Czech Republic is concerned. Well done guys!
suoloordi 4 days ago 0 replies      
Is this different than Nokia's 3D Maps? This is Stockholm: http://here.com/59.3314885,18.0667682,18.9,344,59,3d.day

edit: I see this covers different regions in the Czech Republic, whereas Nokia covers some well-known cities all over the world.
fractalsea 4 days ago 0 replies      
I find this very impressive. The fact that you can rotate arbitrarily and see correct textures applied to all surfaces of buildings/foliage is amazing.

Can anyone provide any insight into how this is done? Is there a dataset which specifies the detailed 3D layout of the earth? If so, how is it generated? Is there satellite imagery of all possible angles? Is this all automated, or is there a lot of manual work in doing all of this?

chris-at 4 days ago 1 reply      
helloiamvu 4 days ago 0 replies      
Seznam is also working on 'Street View'. Check this out: https://scontent-b-lhr.xx.fbcdn.net/hphotos-prn1/l/t1.0-9/10...
robmcm 4 days ago 0 replies      
I hate the use of the history API.

I don't want the back button to navigate the map!

antjanus 4 days ago 0 replies      
In all the time I've been coming here, I never would have thought that Seznam would make it to HN. You should check out their tile search feature!

They experiment a TON, all the time.

bitL 3 days ago 0 replies      
Congrats! Great job guys!

Just a few questions - what algorithm do you use for geometry simplification? Is it based on quadric error metrics edge collapses? How do you join tiles of different LODs? Any papers on reconstructing 3D from your drones?

kome 4 days ago 0 replies      
Far better than Google, Bing and Apple maps. Nice work, Seznam.

Why doesn't Seznam exist in other European languages?

The Czech Republic is a small market, and if they focus just on the Czech Republic their economies of scale will break down very soon. They need investment to update their technology, but if their market is so small, that becomes prohibitively expensive very quickly.

RankingMember 4 days ago 2 replies      
Very nice. I wonder where the source data (building textures, etc) came from.
adam1234 2 days ago 0 replies      
Hi, I'm from Seznam.cz and I'm in charge of Mapy.cz. Let me explain a couple of things and correct some misinformation:

1) As was already said above, Seznam.cz is not a startup, it's been here longer than Google, and it's one of the biggest Czech companies (over 1000 employees). In terms of monthly user counts, it is still number one on Czech internet, in front of Google.

2) the imagery is not taken by drones, nor helicopters, but airplanes. Nobody is able to take imagery of large areas (hundreds of kilometres) by drone; it is only possible by airplane with today's technology. We have our own imagery, not bought from anyone.

3) the 3D model is computed from the aerial imagery (orthophoto + oblique). No manual work is performed. It is a highly demanding computation, consuming months even on a huge cloud of top-notch supercomputers - which is why it is not easy, even for Google, to do it on large areas around the whole globe. The computation will run for almost one year to create a 3D model of thousands of square kilometres, which we plan to publish this year.

4) our maps earn money by selling the primary data we produce (orthophoto, oblique, Panorama, etc.) to commercial professional users, not only in the Czech Republic but also abroad. The company as a whole earns money from selling media space on its 20-30 web sites (seznam.cz, firmy.cz, sreality.cz, novinky.cz, etc.). The company is highly innovative in terms of technologies (as I describe above) as well as in user interface (our GUI never copies other designs; we constantly create and test newer and newer GUIs).

Howgh :)

Piskvorrr 4 days ago 1 reply      
Why does the error message remind me of "This site is only accessible in IE5. Get it [here]"?

In other words, we seem to be rapidly drifting back into the Bad Old Days, when sites were made for a single browser? Not using Firefox? You're SOL. Not using Chrome? You're SOL elsewhere.

dharma1 4 days ago 0 replies      
Same stuff as Apple Maps and Nokia 3D Maps - low-flying planes and lots of photos. Apple bought a Swedish company from Saab to do this.

Nice to see it can be done with a single UAV and camera. Is there any open source software doing this?

_mikz 4 days ago 1 reply      
Vypadá to skvěle. Looking great.
felixrieseberg 3 days ago 0 replies      
I actually think that this is slightly less detailed than the Bing Maps Preview, where I could see my friend's car parked in front of his research institute - I'm impressed that it's running in a browser though.


SchizoDuckie 4 days ago 1 reply      
Sweet holy mamajama.

Have they actually scanned this? Or are they generating this from Google Maps imagery?

SchizoDuckie 3 days ago 0 replies      
Someone please build a nextlevel Command&Conquer on top of this. that would be wicked.
hisham_hm 3 days ago 0 replies      
Unlike Google 3D Maps, this actually works on my computer!
tarikozket 3 days ago 0 replies      
Apparently we will see more real world cities in the future games, voila!
tomw1808 3 days ago 0 replies      
Instantly want to play SimCity again. It can't be just me.
vb1977 3 days ago 0 replies      
The model is calculated from aerial photographs. The software for this was made by Melown Maps, a Czech computer vision company. See their website http://www.melown.com/maps for more models.
tristanb 3 days ago 1 reply      
When is someone going to put maps like this into a flight simulator?
aves 3 days ago 0 replies      
The city reminds me of City 17 in Half Life 2.
ReshNesh 4 days ago 0 replies      
That's where I run. Very cool
evoloution 4 days ago 5 replies      
Would Google try to buy the startup, hire the developers, or just reinvent the wheel in-house?
matiasb 3 days ago 0 replies      
Almad 4 days ago 0 replies      
Thumbs up!
dermatologia 4 days ago 0 replies      
me gusta
toddkazakov 4 days ago 0 replies      
secfirstmd 4 days ago 2 replies      
Cool, I smell buy out in 5, 4, 3, 2, 1... :)

I like the idea of bringing more of the contours back into maps. The move to flat satellite and Google Maps-style stuff has meant that the skill of navigating by most efficient effort (e.g. across contours, not just A to B) is rapidly getting lost.

The Invention of the AeroPress priceonomics.com
257 points by duck  1 day ago   124 comments top 15
drcode 1 day ago 4 replies      
I am not a coffee nut, but got one of these serendipitously as a present and can attest it makes a good cup of coffee. Here's some facts (maybe obvious to others) I found out after the fact that I wish I had known a long time ago:

1. If you've never owned an electric kettle, you may not know that they will boil water in mere seconds (!) and are a key part in making an Aeropress practical. (And they're cheap: http://goo.gl/8dVnP2)

2. If you ever get heartburn and drink coffee, you MUST buy one of these ASAP; it will help immensely. Fast-brewed coffees have very low acidity, so you should treat yourself to an AeroPress (or Starbucks Clover coffee, another fast-brewed method) if you have this problem... you will probably feel better.

3. That said, these things brew a coffee that is like a hybrid between espresso and drip coffee, so if that sounds bad/good to you, you'll feel about the coffee you get from this the same way.

4. An aeropress is the easiest to clean kitchen appliance I've ever owned. You will NOT think to yourself "boy this coffeemaker could probably use a thorough cleaning" every time you want a cup of coffee... no such anxiety with this device.

deckiedan 1 day ago 5 replies      
I've been using an AeroPress almost every day for over 3 years, and love them.

I've had only one issue with them, in that the inside has become scratched and discoloured, which somehow messes up the taste as well (after about 2 years use).

A totally worth-while purchase. I'm now on my second one, hopefully this lasts longer, but if not, I'll just buy another one in a couple years.

One thing I like is how manual the process is - I can make a much weaker cup of coffee for guests who don't like it so strong without it becoming bitter. For different types of coffee, I can brew it in different ways.

And it's odd, but even cheap pre-ground supermarket coffee tastes hugely nicer with the AP. These days, I buy pretty much whichever fairtrade coffee beans are cheapest at the supermarket, grind enough for each cup, and it tastes great.

thom 1 day ago 0 replies      
For my shame I now mostly use a Nespresso, but my Aeropress is certainly the best coffee _placebo_ I've ever used. I have no idea if the coffee was better or worse, but it certainly felt like I was sciencing up the perfectly crafted liquid with space-age tools. It's a lovely ritual, especially when it breaks you out of a hard problem at work to grind the exact right number of beans, measure the exact right temperature, steep for the exact right number of seconds etc.
guelo 1 day ago 5 replies      
I used an AeroPress for a while but now I'm back on a glass French press. It's easier to deal with, glass feels cleaner and more durable, and the coffee is delicious.
fieldforceapp 1 day ago 4 replies      
Just ordered my first AeroPress, wondering what type of grind works best?

> With his plans mapped out, Adler went to Westec Plastics in Livermore, California,> ordered $100,000 worth of molds, and put the invention into production.

Just how does one go about stepping up from something like 3D printing (manifold meshes [0] are easy) to something like injection molding?

[0] http://www.shapeways.com/tutorials/prepping_blender_files_fo...

misframer 1 day ago 3 replies      
If you're looking for a grinder to go along with the AeroPress, the Hario hand grinder[0] is a good choice.

[0] http://www.amazon.com/Hario-Coffee-Mill-Slim-Grinder/dp/B001...

udfalkso 1 day ago 2 replies      
Should users be concerned about pouring hot liquid into plastic and then consuming the result? BPA and such?
JasonCEC 1 day ago 0 replies      
For anyone interested in quantitatively measuring the flavor profile of your coffee, and seeing how dosing and other changes effect your perceived quality of your daily (hourly?) ritual....

My startup [0] makes an Android App just for the occasion! We run statistical quality control and flavor profiling for coffee roasters, beer brewers, and distillers, so your reviews may actually help your favorite coffee companies make better roasting and sourcing decisions.

[0] www.Gastrograph.com [1] https://play.google.com/store/apps/details?id=com.gastrograp...

pistle 1 day ago 0 replies      
Have this, enjoyed it. I've since cut coffee out as my caffeine delivery device, so I could gift mine along with a pile of filters.

Very good. You don't get the crema that you get with a French press, but it's still head and shoulders above drip. I got roughly equivalently good coffee over a wide range of beans and a fairly moderate range of grinds/grinders, from espresso grinds to the fine grinds from the cheap bean/spice grinders you can get at your grocery or Walmart/Target.

Cleanup and maintenance is better than my drip ever was and the process is quicker, somewhat fun even, and a little bit zen.

gingerlime 1 day ago 0 replies      
I've yet to get an AeroPress, but what struck me most about it was what I would consider "internet-age packaging": a customer testimonial printed on the package. I thought it was brilliant and can't recall any other product that does it. Then again, I don't shop much, so I could be wrong.

I wonder if they A/B tested it ;-)

SippinLean 14 hours ago 0 replies      
Someone PLEASE make a glass or stainless Aeropress. Aerobie refuses to, I don't want to mix hot water with plastics, and the plastic discolors.
blt 23 hours ago 0 replies      
I love the Aeropress! It's also lightweight and tough, so you can take it camping. A big upgrade from instant coffees.
dhughes 1 day ago 1 reply      
I wonder if I could just use a large bore syringe and some cotton wadding?
elwell 1 day ago 1 reply      
I use the inverted method, which is pictured in the first photo. It helps you get all the oils, which float to the top.
jevin 22 hours ago 1 reply      
I'm curious as to how this compares to a percolator. Anyone tried both?
The Government is Silencing Twitter and Yahoo, and It Won't Tell Us Why aclu.org
256 points by e1ven  1 day ago   91 comments top 18
sage_joch 1 day ago 8 replies      
If Twitter and Yahoo really wanted to disclose this information, there is nothing the government could do to stop them. Civil disobedience is one of the most important tools we have against Orwellian governments. And doing the right thing is infinitely more important than following authoritarian orders.
binarymax 1 day ago 4 replies      
Here is what I don't understand: actions like this take more than one person. It takes hundreds of people, complicit in actions that are the opposite of what anyone would consider 'American'. It's plainly fucking un-American. So who the hell are these people? Why are they getting away with it? Sure, covert things have always been going on to protect the public... but this is so in-your-face against what we should stand for that I cannot fathom how it is allowed to continue. I am so very angry that things like this are happening. The worst part? We can't seem to do anything about it, so we just end up whining on the internet.
dm2 1 day ago 4 replies      
I don't know what information they are trying to suppress, but in my opinion gag orders are wrong, and the US government should have to thoroughly justify any action that goes directly against the US Constitution.

Say the plans for the US Navy's railgun, or advanced nuclear specifics, or advanced drone technology, or time-travel technology leak out: should the US government be able to do everything in its power to wipe that information from the internet to ensure that it has the #1 military in the world?

I don't know the answer to that question. But I do know that if a terrorist group got that information and used it to cause massive amounts of destruction, people would wish the US government had done more to protect its military secrets.

The Patriot Act really messed things up.

Does anyone have a solution? Does it involve getting more ethical people into congress?

I have a solution, and it's called Online Voting. We are probably one of the few online communities that could actually make it happen, yet it's never even mentioned. Start by testing it in select states, then, if it's successful, move towards implementing it nation-wide. I have proposed it before, but people just tell me it's impossible because of security. There HAS to be a valid solution -- validating your votes, numerous checks in place, open-source code, I don't know -- but hopefully someday we can all work together to make it a reality.

We could even work to eliminate Congress eventually: have every single thing that normally goes before Congress voted on by individuals. Yes, the president could overrule votes (such as building a spaceship for trillions of dollars), but near-unlimited transparency just seems to be the obvious answer. There will be a ton of resistance, because those in power don't want to give up power. Maybe I'm wrong, but I think we should at least try it.

artellectual 1 day ago 1 reply      
America is slowly turning into its own worst enemy. Is this what the war on terror has led to? "No one can terrorize our people; we terrorize our own people by slowly taking away all their rights and liberties." Good job, America.
ChuckMcM 1 day ago 1 reply      
This looks like another attempt by the ACLU to get the national security letter issue resolved (or undone). It's a worthwhile battle and I support them in it, but I'm not sure what this article brings to the conversation. It is yet another skirmish in the district courts.
yoamro 1 day ago 3 replies      
"In a democracy, if your government is going to gag someone from speaking, it should publicly explain why"

It seems like even the ACLU is saying "it's okay to censor people, just please tell us why". I'm not sure what to even think of this.

Zigurd 1 day ago 0 replies      
It is very very hard to sympathize with the companies involved. If they wanted to, they could implement secure communication where ephemeral keys or user-controlled keys are used, and open source clients would guard against the placement of "bugged" client software. Then they could hand over cyphertext on demand, and not take on the job of supposedly protecting people who in their estimation merit protection. Protect all the bits.
higherpurpose 1 day ago 1 reply      
So, people still don't think they should prioritize getting their services from the EU rather than the US? It seems to me that despite all the laws and the Constitution, the government can still pretty much force any company to do whatever it wants, "legally" or extra-legally (hello Amazon/PayPal/Visa/Mastercard!).

Post-Snowden, US companies don't deserve a second chance - at least not while the US government doesn't seem to have any remorse about mass surveillance and its abuse of power, and has no serious intention of reforming itself.

mtimjones 1 day ago 1 reply      
The Obama administration is the most open and transparent administration in history. Not.
hadoukenio 1 day ago 2 replies      
Stupid but serious question...

If I get a National Security Letter, am I allowed to show it to my lawyer? If so, is there any limit to the number of people in my legal counsel? If NOT, what's stopping me from having everyone in the country as my legal counsel? Then, since I can only show the National Security Letter to my lawyers, I can host the NSL on my website for everyone to see.

Crazy or illegal?

anigbrowl 1 day ago 0 replies      
(In one case, the government withdrew its gag order application after Judge Facciola invited Twitter's participation.)

...and? How did this play out in the context of the original subpoena?

xname 1 day ago 0 replies      
Chinese government can silence Twitter and Yahoo? Weird.
wavesounds 1 day ago 0 replies      
So the Government can't limit the amount of money corporations can spend on influencing political campaigns but, when it feels like it, it can silence them without even explaining why. Yeah that makes sense.
omarhegazy 1 day ago 0 replies      
I'm reading through these comments and really laughing my ass off at the bold, crazy rhetoric being used here.

Orwellian governments! Everyone is just a complacent cog! America is its own worst enemy! No more freedom!

I mean, I admit, this specific act seems pretty dumb on the government's part. I have no idea why the government would silence Twitter or Yahoo. And given that they didn't care enough to explain, it seems like it was probably a legal bug. Maybe some 60-year-old anti-Internet Congress member that thought SOPA was genius got cranky one day and made some phone calls. I don't know; it doesn't seem like anyone that truly has power in government cares, or else they'd succeed in silencing. This doesn't seem like "an extraordinary effort by the government"; this is the same government that dropped two nukes in order to end a worldwide war, so if they really cared about what your favorite anti-government Reddit liberal had to say in a 140-character witticism, they'd be able to really shut it down.

But I have a feeling that these alarmist and dramatic comments regarding FASCIST AUTHORITARIAN GOVERNMENTS HOLDING THE TRUE ARTISTS DOWN! have much more to do with the general reddit.com/r/technology culture of shitting on everything the government has been doing since last June. And that culture is much more retarded and out-of-hand than anything PRISM could possibly be.

It seems like what the government is doing with user data since PRISM is very similar to what advertisers have been doing with user data since Fucking Forever: the government doing it for its ideals regarding terrorism, and advertisers doing it for the moolah. But the point is the same -- massive data collection and other forms of statistical analysis that you dramatic fucks label "spying" in order to seem passionate and cool have been going on for a while. It's just that the government is a much bigger and more complex system than your average advertiser, so you seem like a hip and happening individual by attacking it.

Most people really don't give a shit about massive data collection. I mean, sure, everyone's a Reddit slacktivist nowadays, throwing around words like "spying!" and "privacy!" but no one really cares, or else we'd all be using rsync + ftp and BitMessage and all that idealistic free software stuff that RMS peddles. People just want to seem special and cool and smart and advanced when they post about how EVIL the government is for spying on all of us. But no one honestly cares, or else no one would use Facebook and Google and Apple products.

And should you care?

Is it really that significant, your tiny, indistinguishable contribution to our advertising overlords, which isn't even tied to your personal identity? Is it really that creepy, or a violation of privacy? It's not like the government knows that you specifically are Omar Hegazy or E1ven; they don't care about everyone's specific identities (now that would be truly creepy). I mean, even if they have your specific data tied to your real name, it's not like the NSA has people actually listening to and looking at your conversations and spying on who you are and what you do. That would be statistically impossible. There are 316 million American citizens, 204 million e-mails sent per minute, and 1.26 billion Facebook users, and only 75 thousand NSA employees.

Could they really be reading all your e-mails tied to you the person -- are they even capable of that? They wouldn't be able to spy on each and every one of you even if they tried, and that wouldn't make sense, either. So I'm pretty sure the only thing that knows about you is the program transferring stuff from Google's servers to NSA servers, and you can trust that one not to be sentient enough to care.

So. They're not spying on individual people, cause they physically can't - but that would really be creepy. They're checking on aggregate statistics. And when you're just another brick in the wall of statistical analysis, is it really all that creepy? Do they really know all that much specifically about you?

But why are they looking at aggregate statistics, you ask?

Good question. Don't laugh -- I think it's terrorism? I mean sure, that seems like such a cop-out answer from our perspective. But how do we know that the only reason terrorism isn't a threat anymore isn't the American government putting its foot down? Couldn't it be that the government's seemingly creepy obsession with fighting terrorism is the reason Al-Qaeda and such have failed so hard that we just laugh at the possibility of them being a threat? If terrorism isn't a threat anymore, couldn't that show that the government's way is working? I mean, this bin Laden guy. His family was filthy fucking rich, man. They were connected to Saudi royalty. These Al-Qaeda guys had fucking planes, man. And they reaaaaalllllly didn't like us. So obviously, if we just did nothing about it, they probably would've struck again ...

But I don't know. I haven't done enough research on this topic myself. Maybe the government's obsession with terrorism is a bit too much and while we should be worrying about this issue in order to keep it from happening, maybe 20x the military budget of the next 10 countries on the list combined is a bit too far. Or maybe it's just the right amount. I don't know, I read programming books in my free time, not political discourse.

But it's just so obvious from their alarmist bullshit that these Reddit libertarians haven't done their research either. They just want to seem like they have.

cordite 1 day ago 0 replies      
If the public knew the why, then they'd want to know more. So far the pattern of information control is to seal it as close to the root as possible.
ticktocktick 1 day ago 0 replies      
Think of the government as being a member of Hacker News with nearly infinite downvote power. If it doesn't like what you've said it disappears what you've said...often without even so much as an explanation.
dfa0 1 day ago 0 replies      
Why does a dog lick his balls?
ama729 1 day ago 1 reply      
I don't understand the outrage here; aren't gag orders a pretty standard police tool? I mean, if you are tapping the new Al Capone/Bin Laden/Ted Bundy, then surely you don't want Twitter to reveal it, no?
Ubuntu 14.04 LTS ubuntu.com
253 points by jacklight  3 days ago   159 comments top 28
etfb 3 days ago 7 replies      
Well, that was fun. The upgrade crashed halfway through. I rebooted in recovery mode, finished the upgrade using the dpkg repair option, and then restarted a few times and found that it still wasn't working. Looked like a video problem, so I checked /etc/X11 and found the xorg.conf files had been backed up and not replaced. A comment in the last one inspired me to run nvidia-xconfig and it worked fine. Pretty typical when you use Nvidia drivers, and still a lot less painful than upgrading Windows XP to 7.
samolang 3 days ago 6 replies      
The first comment I saw said it was impressive. The second said I should use Gnome instead of Unity. The third said I should try something called localepurge for a leaner install. The fourth said KDE was better than Gnome and Unity. The fifth recommended Xubuntu over Gnome.

This is the sort of fragmentation that makes popular adoption difficult, but is also what makes Linux awesome.

sandGorgon 3 days ago 5 replies      
I really recommend trying out the Ubuntu Gnome flavor [1] - I really like it as being more usable than Unity.

Plus https://extensions.gnome.org/ is incredible.

P.S. - [2] this is my personal optimization script for a lean and developer friendly Gnome Ubuntu 14.04 install. YMMV.

[1] https://wiki.ubuntu.com/UbuntuGNOME/GetUbuntuGNOME

[2] https://gist.github.com/sandys/6030823#file-lean_install_ubu...

josteink 3 days ago 1 reply      
As someone who just installed the beta 2 on my laptop a few days ago, I have to say I'm impressed.

This thing cold boots on my non-UEFI laptop in 4-5 seconds. That's at the same level as Windows 8.1, which also impressed me greatly.

Now if they can only get systemd and the "online in 50ms" updates implemented, this thing will be super-sweet.

pan69 3 days ago 5 replies      
I just took XUbuntu for a spin. It's just as great as the 13.10 release. If you're a GNOME refugee and looking for an excellent desktop then I can't recommend XUbuntu enough.
gkya 3 days ago 0 replies      
Wouldn't submitting a link to the release notes rather than the desktop download page be better?


crb 3 days ago 0 replies      
Zardoz84 3 days ago 3 replies      
I really recommend trying out the Ubuntu KDE flavour (Kubuntu). I really like it as being more usable and configurable than Unity and Gnome.
NathanOsullivan 3 days ago 5 replies      
I really don't get the intention with the default visual style Ubuntu has settled on. I'm sure a lot of work has gone into it but it's just not attractive.

I previously thought it was growing pains and they would eventually land on a great style that was still "theirs", but at this point it feels like a lost cause. Personally I've stopped recommending Ubuntu on the desktop because I already know what the initial reaction to a fresh install is going to be.

tatqx 3 days ago 0 replies      
I love how the rounded window corners are now (finally!) properly anti-aliased [1].


neverminder 3 days ago 0 replies      
I've installed it today with full disk encryption option, but my keyboard wouldn't work on the password screen... It looks like this bug: https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1309246
fotcorn 3 days ago 0 replies      
Droplets on DigitalOcean are already available with 14.04 LTS. Now I only need time to upgrade our servers...
cies 3 days ago 1 reply      
It seems I may use this opportunity to recommend an Ubuntu derivative. I'm very happy with Netrunner-OS [1]. It comes with KDE and gets "sane defaults" right. AdBlocker, YT-downloaders, codecs, etc. -- all pre-installed.

And they also gave some thought to making sure it looks good out of the box.

All Unity-refuge-seeking, but otherwise Ubuntu (Debian++) lovers should have a look at it. :)

1: http://www.netrunner-os.com/

bttf 3 days ago 0 replies      
Ubuntu had been collecting anonymized data and sending it off to Amazon [1] since 12.10? According to this article [2], it is not included in 14.04.

[1] https://www.gnu.org/philosophy/ubuntu-spyware.html

[2] http://www.theinquirer.net/inquirer/news/2337185/ubuntu-to-d...

spindritf 3 days ago 0 replies      
I just upgraded my personal server from 13.10 and it was pretty painless. Although many third party repos are not yet ready for Trusty.
ing33k 3 days ago 2 replies      
Just upgraded from 12.04 LTS to 14.04. Already liking the locally integrated menus.
chmike 3 days ago 0 replies      
Failed to install here on my 13.10 (French) version because of an error related to an invalid ASCII code. I'll wait before trying again.
hnriot 2 days ago 1 reply      
nvidia and linux just don't seem to work together. I've been on 13.10 and looking forward to an upgrade in the hope that it might fix the problem where compiz freezes up the window manager for minutes on end. Linux with nvidia is about as unreliable as using Windows, which is very frustrating.

Anyone know if 14.04 works better with nvidia cards?

manish_gill 3 days ago 4 replies      
Meh, after struggling with linux for years, I finally swallowed my pride and paid a hefty price for a Macbook Air.

Best decision I ever made. OS X hasn't disappointed me yet. I don't have to worry about driver/sound problems or incompatible libraries every other week.

pmelendez 2 days ago 0 replies      
I just upgraded one of my servers and it became unresponsive after the reboot :(

I am able to ping it, but every port looks closed. Has anyone had a similar experience so far?

donniezazen 3 days ago 0 replies      
I have been using Linux for a solid few years. I stopped using Ubuntu because I wanted more control and I like new technology. Now I am learning Android development. I have less time, less patience, and require stability. I am thinking about switching from Fedora 20 to Kubuntu 14.04 for the sake of stability.

Will switching to Kubuntu save me the time and headache involved in getting things fixed and working on cutting-edge Linux?

ecocentrik 2 days ago 0 replies      
Why are people running OS upgrades on live production servers the day after release? Who does that? It's definitely not people who care about stability. If you run into issues, HN is probably not the right place to share them...
gd2 3 days ago 1 reply      
Any tips on how to make the update from beta 2 to final easy?

I want to keep all my apps, aliases, .bashrc edits, etc. Thankfully, I don't do this often enough that I remember the process at 7 am.

troels 3 days ago 1 reply      
Anyone have a guess as to when there'll be an official image on AWS?
walshemj 3 days ago 0 replies      
So have they put back the key bits of X Windows they removed in a previous LTS? I was not happy, after setting up my small Hadoop home lab, to find that some idiot PFY had removed the functionality that made remote login possible!
rohith_14_04 3 days ago 0 replies      
The first thing you should do after installing Ubuntu 14.04: go to Software & Updates, then in the "Download from:" dropdown choose "Other", then in the new window click "Select Best Server". You will have a speedier installation of the rest of your software. I'm sure you'll do this for future releases once you compare the speed of the suggested server (usually country-specific) and the best server.
shacharz 3 days ago 4 replies      
Why is the download so slow? Why not add a torrent link?
boohoofoodoo 3 days ago 0 replies      
Painless upgrade with no problems.
Those Who Say Code Does Not Matter acm.org
245 points by swannodette  4 days ago   263 comments top 53
ChuckMcM 4 days ago 5 replies      
That was a long way to go to insult C and brag about Eiffel. Ada doesn't have this problem either, but nobody is jumping up and down saying how it's the one true language. Back when I was looking at PhD topics (I ended up jumping into work instead), "provably correct" code was all the rage. Lots of folks at USC-ISI were looking into proving that the code expressed the specification, and that the resulting executable faithfully expressed the intent of the code. End-to-end correctness, as it were.

What struck me about that work was that invariably there was some tool you ran over the specification and the code, and it implemented some algorithm for doing the work. And yet if you went that far, then you should at least be willing to run something like lint(1); and had anyone at Apple run it, or made warnings fatal (always good practice), the repeated goto would never have escaped into the wild (useless code is useless; it's a warning in both GCC and Clang, and always flagged by lint).

So is the challenge the language? Or the processes? I tend to favor the latter.

loumf 4 days ago 2 replies      
He's right, but he's disingenuous in saying that random line duplication can't cause catastrophic problems in Eiffel. This very specific bug can't happen in Eiffel, but the class of bug can (bug caused by bad merge or accidental, unnoticed line duplication).

If most code were idempotent, functional, immutable, etc -- then we'd start to get there, but usually randomly duplicating lines is going to be an issue unless it's always a syntax error.

I'd say clojure has more of a chance. (1) lots of immutable data and functional style (2) duplicating code lines is likely to result in unbalanced parens -- the unit of atomicity is the form, not a line. Many forms span lines in real code, and many lines contain partial forms (because of nesting).

Still there is plenty of clojure code that is line oriented (e.g. data declarations)

habosa 4 days ago 6 replies      
PSA: Always put brackets after your conditionals (in languages where you can). You never know when a one-line conditional will become a ten-line one, and you can get this sort of bug. It's not worth the two saved keystrokes now to have the NSA in your data later.

I think the most readable code has no shortcuts and no tricks. I'll take unambiguous over concise or 'beautiful' any day.
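A minimal sketch of the hazard (the function names and arithmetic are invented for illustration, not taken from any real codebase): without braces, a statement added later silently escapes the conditional, and here even the enclosing loop.

```c
#include <assert.h>

/* Buggy: the for and the if each govern exactly one statement, so the
 * later-added "total += 10;" is attached to neither -- it runs once,
 * after the loop finishes, despite what the indentation suggests. */
int count_buggy(int n) {
    int total = 0;
    for (int i = 0; i < n; i++)
        if (i % 2 == 0)
            total += 1;
            total += 10;   /* looks conditional; is not */
    return total;
}

/* Braced: the new statement lands exactly where the indentation says. */
int count_braced(int n) {
    int total = 0;
    for (int i = 0; i < n; i++) {
        if (i % 2 == 0) {
            total += 1;
            total += 10;
        }
    }
    return total;
}
```

count_buggy(4) returns 12 where count_braced(4) returns 22. Recent GCC and Clang can flag this shape with -Wmisleading-indentation, but only braces make it structurally impossible.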

freyrs3 4 days ago 1 reply      
Whenever "language doesn't matter" or "use the right tool for the job" is used in an argument it's quite often as a thought-terminating cliche used as a post-hoc justification for personal prejudices. I think almost everyone intuits that there is at least a partial ordering to language quality and safety, we just often disagree about how that ordering is defined.
byuu 4 days ago 12 replies      
I really don't see how enforcing {} syntax on all conditionals is going to make us so much safer.

Yes, people make mistakes, but this is a pretty huge screw-up. If you are modifying an unbraced if-statement and aren't paying attention to scoping, then you are being woefully negligent at your job. Especially when you are working on cryptographic code used by millions of people to protect their most valuable information.

So let's say we force more red tape to make sure this doesn't happen. Those of us who pay attention to scoping probably won't mind too much, it's good practice to do this anyway.

But what about the mediocre programmer? He may decide that now his if/else if/else three-liner, when adding new lines for {}, should really just turn into a switch/case. And now he neglects a fall-through case, or adds an unconditional break; before important code. And we're right back where we started.

It doesn't matter how much we safeguard and dumb down languages. We can load our languages full of red-tape: extra braces, no jumping or breaking, no fall-throughs, always requiring explicit conversions, no pointers, no null types ... all we'll end up with is code that is much harder to read (and possibly write), while the mediocre programmers will find new and inventive ways to screw things up. It's just as likely to make them even more lax, and attract even less disciplined programmers into important development roles. You know, since it's presumed to be so much safer now.

The real problem is the amount of poor programmers out there, and the lack of repercussions for these sorts of things. A doctor that leaves a scalpel in a patient is (rightly) ruined for negligence. Do you think the "goto fail;" writer even received a written warning? Why not?

I'm not saying people can't make mistakes, but I think your pay scale and the importance of what you do should come with some actual responsibility for your errors. Just like in every other profession out there.

Yes, sometimes you can blame the tool. But there are also times when you need to blame the user.
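The switch scenario described above is easy to make concrete. A hedged sketch (the error kinds and return codes are invented for illustration) of the forgotten-break fall-through:

```c
#include <assert.h>

enum kind { KIND_A, KIND_B, KIND_C };

/* Buggy: the missing break after KIND_A falls through into the
 * KIND_B case, so a KIND_A input is silently reported as 20. */
int classify_buggy(int k) {
    int code = -1;
    switch (k) {
    case KIND_A:
        code = 10;
        /* break; forgotten here */
    case KIND_B:
        code = 20;
        break;
    case KIND_C:
        code = 30;
        break;
    }
    return code;
}

/* Fixed: every case ends in an explicit break. */
int classify_fixed(int k) {
    int code = -1;
    switch (k) {
    case KIND_A: code = 10; break;
    case KIND_B: code = 20; break;
    case KIND_C: code = 30; break;
    }
    return code;
}
```

So the red tape moves but the failure mode survives: classify_buggy(KIND_A) yields 20, not 10, with no diagnostic from a default compile.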

guelo 4 days ago 4 replies      
Since he lumps Java in with C and C++, it's worth pointing out that this specific bug is not possible in Java, since unreachable code is a compiler error. I assume the same for C#.

Also, many style guides such as Josh Bloch's highly influential 'Effective Java' recommend against the 2-line if statement without curly braces since it's known to be prone to this type of error. His argument that keywords are better than braces for ending blocks is weak.

haberman 4 days ago 0 replies      
"When people tell you that code does not matter or that language does not matter, just understand the comment for what it really means, "I am ashamed of the programming language and techniques I use but do not want to admit it so I prefer to blame problems on the rest of the world", and make the correct deduction: use a good programming language."

As emotionally satisfying as it can be to stick it to people we disagree with, I think we as an industry could do with a lot less of this black and white thinking.

Programming languages do not fall into a neat good/bad dichotomy. Tell me your favorite programming language and I will tell you three things that absolutely suck about it (even if I like it overall).

Yes, if C could do it all over again it would probably mandate that brace-less blocks go on the same line as the "if" (or are disallowed completely). So I agree with the author that certain features of programming languages can make it more or less error-prone.

But people still use C for a reason. That reason is that C has real advantages. If you really want to improve software engineering, then help the Rust guys out, but don't just tell C users to "use a good programming language."

pjungwir 4 days ago 2 replies      
Paleographers have a whole catalog of scribal errors, which can be useful when trying to reconstruct a text from conflicting extant copies. Perhaps it would be helpful to compile such a list of common programming errors, and consider that list when designing a new language. It would include "scribal" errors like Apple's goto or = vs ==, and also low-level logical errors. It seems like this could make a fun paper for any CS grad students out there.
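One entry such a catalog would certainly contain is the "= vs ==" slip mentioned above. A small illustrative sketch (the names are invented):

```c
#include <assert.h>

/* Buggy: "(level = 0)" assigns zero -- always false -- so the alarm
 * branch is dead and level is clobbered. The extra parentheses only
 * mimic code that has silenced the -Wparentheses warning most
 * compilers would otherwise raise for exactly this scribal error. */
int alarm_buggy(int level) {
    if ((level = 0))        /* meant: level == 0 */
        return 1;
    return 0;
}

int alarm_fixed(int level) {
    if (level == 0)
        return 1;
    return 0;
}
```

alarm_buggy never fires, whatever its input. The old defensive habit of writing the constant first, `if (0 == level)`, turns the typo into a hard compile error -- exactly the kind of fix a catalog of common errors would suggest to a language designer.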
rguldener 4 days ago 0 replies      
Well, the ironic part is that the official Eiffel compiler compiles Eiffel code down to C, which is then compiled again into assembly. So technically speaking, Eiffel still relies on C... Note that it's not very optimized C either; it's much slower than Java for most of the stuff I tried (with contracts disabled).

Said compiler also happens to be terribly buggy and unreliable. The author still teaches the CS "Introduction to Programming" class at my university with this language, and every year students struggle with the language and the obscure IDE. I also don't know anybody that ever wrote a line of Eiffel again after that class, even though the idea of contracts is kind of interesting.

Summa summarum: Best language constructs don't help if your basic tools are broken and make it a pain to write in that language.

AnimalMuppet 4 days ago 0 replies      
"My pet language renders that problem impossible."

Um... OK.

"Therefore you should use my pet language rather than one written in 1968 for a PDP-11."

Not so fast.

First, when the language was written has nothing whatsoever to do with how useful it is today. (Cue the Lisp advocates.) It's just a gratuitous slam, and it comes off as being petty.

Second, even if Eiffel does completely prevent this class of problem, what about the reverse situation? What classes of problems does Eiffel allow that other languages prevent? (Don't bother claiming "None". That's not realistic. It just means that either you don't see the flaws, or you're a propagandist.)

It's about the best tool for the job. Now, it's fine to argue that another tool would have been better for that particular job, but "avoiding one particular class of bug" is nowhere near good enough.

One point for the original article, though: Code does matter. Choice of programming language matters. Code reviews matter. Testing matters. Code review policies matter. Developer training matters. Developer culture matters. It all matters.

john_b 4 days ago 4 replies      
To address only the "goto fail" example the author uses, I don't see how the proposed Eiffel solutions are conceptually any different than always using brackets in your C/C++ constructs. Brackets are the mathematical notation for a set, and having a set of instructions inside them makes perfect sense even if the set only has a single element.


  if(condition){ instruction; }
instead of

  if(condition) instruction;
is already considered good practice, couldn't it also be enforced via compiler pragma?

ryanobjc 4 days ago 2 replies      
It's about both!

From a programmer point of view, the ideal is a language that doesn't let you make simple mistakes like the goto fail; bug.

From an engineering point of view, it's having the processes in place to make sure that when such bugs inevitably happen, they don't end up in the final product.

The reality is having both of these things would be ideal.

eldude 4 days ago 4 replies      
This is fundamentally what I call the Tower Defense[1] model, borrowed from my old manager here at LinkedIn, Rino Jose[2].

The Tower Defense model is an approach to software reliability that focuses on catching bugs at the lowest level, to avoid the inevitable combinatorial explosion in test coverage surface area. In other words, deal with problems like these at the language level, so there is NO NEED to deal with them at a higher process-level.

No one is disputing that processes, QA, or devops couldn't/shouldn't hypothetically catch these bugs before they enter production. The problem, of course, is that they usually don't: the lower-level defenses allow through too many bugs that they really shouldn't, and the higher-level processes become overwhelmed, fail, and allow bugs to cross the critical production threshold.

This means always giving higher priority to lower-level methods of reliability. For example,

* Language rules are more important than

* Unit tests are more important than

* Integration tests are more important than

* Code reviews are more important than

* QA is more important than

* Monitoring is more important than

* Bug reports

[1] http://en.wikipedia.org/wiki/Tower_defense

[2] https://www.linkedin.com/in/rinoj

jw2013 4 days ago 3 replies      
But the problem the author described in the article can be solved just by a better programming habit: always add braces, even for a one-line conditional statement.

Or just put the one-line statement on the same line as the conditional, such as: if(condition) statement; so that when you try to add a line next time, you will notice it was a one-line if statement.

But yeah, not explicitly requiring braces for a one-line conditional statement can give us more succinct code, but it does require better programming practice.

darrencauthon 4 days ago 3 replies      
His code example is:

  if (error_of_fifth_kind)
      goto fail;
      goto fail;
  if (error_of_sixth_kind)
      goto fail;

My question: If the "truly important code" is really that important, where are the unit tests to verify that it handles the non-erroneous case?

Test. Your. Code.

aetherson 4 days ago 1 reply      
It's clearly true that having a language which forced an explicit ending to an if block would have prevented the goto fail bug.

But is there any actual evidence that code written in modern languages has fewer bugs overall? Or is it all "let's focus on this one example"?

As another commenter mentioned, the goto fail bug would have been utterly trivially caught by any unit test that even just tested that it reached the non-error case in any way (you don't even need to test the correctness of the non-error code).

I would like to see data before I believe that "errors that would have been prevented by non-C semantics" constitutes a significant fraction of all bugs, or that they aren't just replaced by bugs unique to the semantics of whatever replacement language you're using.

pron 4 days ago 0 replies      
Of course the programming language matters. The problem is that there are many ways in which it matters, and some of those come at the expense of others.

A language may be more concise, leading to shorter code (which is good), but do so using tricks and "magic" that are hard to follow, which eventually makes it less amenable to analysis by others (which is bad). A language could be very declarative, thus clearly communicating intent (which is good), but do so with leaky abstractions that remove it from the computer's view of things and introduce subtle and severe bugs that, in order to catch, require expertise in exactly how the language is implemented (which is bad).

So while there are certainly languages which are categorically better than others (at least for some domains), there is no consensus on the right path to choose among the "better languages", and they all take completely opposing approaches. I'd even say that most languages used in new code written today are among those "better languages". So while in theory the choice of the programming language matters a lot, in practice -- not so much (as long as you choose one of the many good languages). I don't think that we have a language (or several) that is that much better at preventing bugs as to offset any other advantages other languages may have.

dasil003 4 days ago 0 replies      
I absolutely believe that languages matter and are the best hope for improving code quality and productivity overall. We need better languages.

But the way we get there is not to pick some small but critical bug that could be avoided by an arbitrary style change and declare that a language which does not suffer that stylistic pitfall is superior. The new language may have much worse flaws. You're just playing language flaw whack-a-mole at that point.

If we want to improve we have to get a sense of what types of bugs are most common across all programs and reason from a logical standpoint about how they may be prevented. This will solve orders of magnitude more problems than fiddling around with the syntax to minimize the number of typos people make.

wellpast 4 days ago 1 reply      
This is terribly wrong.

Correctness is brought about by ALL of your tools in hand. These tools include unit testing, processes like continuous integration and code review, and so on, in addition to language features such as its syntax and static analysis capabilities.

The job of the programmer is to understand all of these tools and then use them conscientiously and use them well. There is NO tool a programmer can't shoot themselves with. There's no prima facie perfect tool, and the combination of your tools is a better thing to evaluate anyway. A nail isn't universally useful. With a Phillips-head screwdriver things get a little better; but with a hammer, you'll start moving.

A good architecture and intelligent, disciplined execution is WAY WAY WAY more important than the specific tools we use. Arguments like this one are bike shedding.

TheLoneWolfling 4 days ago 3 replies      
How about forbidding newlines in a single-statement if and requiring a newline after the first statement after the if?

So this is allowed:

  if (foo) bar();
  baz()

But this isn't:

  if (foo) bar(); baz()

And this isn't:

  if (foo)
    bar();
    baz()
Edit: formatting

AdrianRossouw 4 days ago 0 replies      
Code doesn't matter, what matters is what it does.

How usable, secure, stable or fast it is are properties of how well it accomplishes its task.

There's an amazing presentation by the author of Clojure called Simple Made Easy. Since I couldn't just link people to a 1 hour presentation, I made some notes on it :


The code that we write he calls a construct, and our application is what he calls the artifact.

We need to evaluate constructs based on the complexity they create in the artifacts.

Using C, for instance, affects the complexity of our applications by introducing many more opportunities for human error.

MortenK 4 days ago 0 replies      
Some types of errors might be easier to make in one language versus another, but a language that through its syntax eliminates the possibility of all errors is of course a ridiculous notion. Cherry-picking particular error types that are avoidable in the author's language of choice does not prove anything.

The concept of why code does not matter comes from development management literature. It's not a case of actually meaning "I'm ashamed of the language and the techniques I use". That's an awfully developer-centered point of view.

The influential factors in a successful software project are mainly the quality of the people involved; next comes product scope, and from there a huge drop down to development process, and finally technology, i.e. language.

It's been statistically shown that barring a crazily bad technology choice (Visual Basic for the space shuttle kind of bad), language has very little influence on the success of a project.

That's of course not a nice thing to hear for a developer who's convinced his language of choice is the one true language. Regardless, it's well established knowledge, gained years and years ago through statistical analysis of thousands of projects.

nshepperd 4 days ago 0 replies      
The "improving our tools is no silver bullet, so let's keep using what we already have" thinking that this guy is criticizing is an excellent example of the fallacy of gray (http://lesswrong.com/lw/mm/the_fallacy_of_gray/). Of course, using Haskell doesn't prevent all bugs, and PHP doesn't always cause disasters. But it's easy to see which of these is the darker shade of gray. And there is a point where switching to safer (but not perfectly safe) tools is the right thing to do.
scotty79 4 days ago 2 replies      
Another thing that enabled this bug is that it was written in a language that allows misleading indentation.

Programmers indent all code. By making indentation not meaningful in your language, you are ignoring a large part of how programmers read and write code and allowing for many misunderstandings between programmer and compiler.

acbart 4 days ago 0 replies      
A lot of the Computational Thinking movements seem to stress that "Computer Science is more than just computers!" And that's true, we have a lot more to offer! But at the same time, it's so misleading because so much of our really cool stuff is wrapped up in being able to program. I mean, CS is about solving problems at scale, and computers are how we best solve problems at scale. We can teach lots of carefully crafted things without them, but it's always going to ring a little false, and we won't get students more than an inch-deep into why we're truly cool.
ef47d35620c1 4 days ago 1 reply      
What language is he implying to be "good"?

It's obvious that he thinks C, C++, C# and Java are bad (due to syntax). The world mostly runs on those, so I guess we're all doomed. But if they are so "bad", then what does he consider "good"? I read it, but must have missed that part.

ww520 4 days ago 0 replies      
But Eiffel can't prevent the bug of swapping the order of two statements accidentally; it must be a bad language!

Come on. Cherry-picking a weak feature of a language to invalidate the whole language is just disingenuous. All languages have strengths and weaknesses. One has to weigh the positives and the negatives and decide whether it would work.

al2o3cr 4 days ago 0 replies      
I think you wind up with problems no matter what the tooling - for instance, a language that required that every line be provably terminating would never suffer from infinite loops, but whether the project using said language would ever halt remains to be proven. :)
anigbrowl 4 days ago 0 replies      
I found the article interesting, but I wonder why he didn't also discuss using 'switch/case' which would surely have been more appropriate than a succession of IF statements.

Of course you can screw things up with switch/case too, but in my limited experience that usually involves a design flaw rather than just a typo.

wpaprocki 4 days ago 0 replies      
There are other reasons why choice of language matters. If you need a simple web app, you're probably writing it in PHP or Ruby instead of C. But you'll likely use C if you're interfacing with hardware. A lot of apps that need high concurrency use Erlang. If you can write a quick Python script that solves a problem before you can even get Eclipse to load, then why would you even bother trying to write the solution in Java?

Language errors aside, it's pretty obvious that at least in some cases, the choice of programming language does matter.

Guvante 4 days ago 0 replies      
Alternatively use a whitespace sensitive language. Defining how to handle tab vs space is irritating, but there are plenty of solutions to that.
miscreant 4 days ago 0 replies      
On the surface, this reasoning makes sense. Unfortunately, human coders are the ones writing code. This means the code might have bugs in it, regardless of the language you choose. While it would be great to invent a new language n+1 whenever the blame for bad code can be directed at programming language n, it is not likely that you will find coders that are willing to repeatedly take that ride.
gweinberg 4 days ago 0 replies      
BTW, am I the only one who thinks that duplicating a line of code is 1) not all that common in the first place and 2) something that should FUCKING JUMP OUT AT YOU IN CODE INSPECTION LIKE A BLOCK OF ALL CAPS FUCKING FILLED WITH PROFANITY?

srsly, this was not a subtle hard-to-find error.

jheriko 4 days ago 0 replies      
Yet nobody is making a better C, and we are stuck with performance vs. safety.

Hit the nail on the head about if, although I disagree about the fancy syntax examples - enforcing scopes is enough - every decent coding standard I've worked with forbids one-line ifs... as does common sense acquired 15 years ago...

hyp0 4 days ago 0 replies      
That font - tiny and grey - was by far the hardest to read on Android (increasing the minimum font size to 20 pt fixed it).

It's considered good practice to brace if/loop clauses (unless on the same line) for this very reason. Not enforced, though I expect lint picks it up.

jds375 4 days ago 0 replies      
About single- and multiple-expression if-statements, I couldn't agree more. Everyone says I'm an idiot for always assuming multiple expressions (it is ugly), but in the end it is safer.
peterashford 4 days ago 0 replies      
Calling out Java for this issue is BS. Java doesn't allow unreachable code. If Java had a 'goto' instruction, that code wouldn't compile because the latter code was unreachable.

The article is poorly researched rah-rah for Eiffel. Eiffel is a good language, but not for the reasons the author states.

joesmo 4 days ago 0 replies      
The article's suggestion is: "make the correct deduction: use a good programming language."

It would be enlightening to hear what Mr. Meyer or anyone else thinks would fit that bill on the Apple's platforms. Until the article actually provides a real solution, the article isn't making a point at all.

boomlinde 4 days ago 0 replies      
> Often, you will be told that programming languages do not matter much. What actually matters more is not clear; maybe tools, maybe methodology, maybe process.

In this case it could easily have been caught if they had full test coverage, whatever language was used, so yes.

maninalift 4 days ago 1 reply      
Surely the key point is that most of us read indentation first. It doesn't matter whether you are writing C with semicolons and curlies, Ruby with no semicolons and "end"s, or Lisp with parens; what the programmer really reads first is the indentation. Those other things are sometimes checked afterwards.

Therefore there are two reasonable courses of action to prevent this kind of problem:

  * use automatic code indentation
  * use a whitespace-significant language
The second is absolutely the better choice. You may disagree but you are wrong. This is not a matter of taste, it is a matter of programming language syntax actually reflecting what the programmer reads.

clarry 4 days ago 0 replies      
You don't really know that. The brace could've ended up between the two lines and you have the same bug. Been there, done that.
abshack 4 days ago 0 replies      

    if( DoSomething() )
        goto fail;
    else if(DoSomethingElse())
        goto fail;
        goto fail;
    else if(DoSomethingOtherThanElse())
        goto fail;
You get a syntax error at the final else-if. A better way would probably be:

    int err = 0;
    if(!err)
        err = DoSomething();
    if(!err)
        err = DoSomethingElse();
    if(!err)
        err = DoSomethingOtherThanElse();
    if(err) goto fail;
I would prefer chainable commands that abstract out the error checking, though.

    err = Chain(DoSomething).Then(DoSomethingElse).Then(DoSomethingOtherThanElse);

sfk 4 days ago 1 reply      
Is it possible to prevent timing attacks, control secure wiping of memory etc. in Eiffel?
pjmlp 4 days ago 1 reply      
I wonder if Eiffel could have gotten a more widespread use, if Bertrand Meyer had gotten a major OS vendor behind it, instead of trying to cater for the enterprise.
hyperion2010 4 days ago 0 replies      
My take home is this: if we are going to do everything in our power to make these systems work better, then choosing or developing a language that is intentionally designed to draw attention to common security mistakes, or to prevent them structurally, is a damned good thing to look into. We will also do everything else in our power, but we had better put choosing or making that language on the list.
Dewie 4 days ago 0 replies      
With all the pitfalls of C, the unbraced ifs feel like a very impotent example. Are pretty printers never used? Maybe a good linter would be able to warn about this, too.
brianbarker 4 days ago 1 reply      
I agree that distinguishing between compound and non-compound statements is terrible. However, so is using whitespace-delimited blocks (sorry, Python). I wish he'd proposed the simple solution Go uses: require braces.
ebbv 4 days ago 0 replies      
Just use curly braces all the time. There's nothing in C or C++ stopping you from using curly braces around single line statements. It's just that you CAN omit them.

But having been bitten by this issue early on, I started always using curly braces. I think it's the better way to write C/C++. Frankly I think those who omit them are just lazy.

PavlovsCat 4 days ago 1 reply      
> "If you want the branch to consist of an atomic instruction, you write that instruction by itself"

No, I don't; I generally use curly braces for those too. So for me, the solution would be throwing a compile error when those are missing. Is that really all it takes to make a language "modern"?

I also don't understand the jab at semicolons, which I like, nor do I see how getting rid of brackets around the condition is really a net saving when you have to write "then" or "loop" instead. Apart from being twice as many characters to type, words distract me (much) more than symbols, and now that I think of it, I wonder what programming would be like with symbols for reserved keywords like "if" and "function", so the only text you see anywhere in your code would be quoted strings or variable/function/class names...

Anyway, I think when talking about distractions, one should be careful with claiming to speak for everybody (at least unless you did a whole lot of studying test subjects or something).

bowlofpetunias 4 days ago 0 replies      
As far as I can tell (and I may not be the best judge of that, because I've used a dozen languages over the past 25 years and still fail to see any really significant difference other than language philosophy as in OO, functional, etc), this is not about language quality but language safety.

Those two are not necessarily the same, and some of the most elegant languages aren't particularly safe. Conversely, contrary to what people like to claim based on hearsay and no-longer-existing features, a badly designed language like PHP is definitely not less safe than Python or Ruby.

By the standard for "good" set by the author, no dynamically typed language would make the cut.

autokad 4 days ago 0 replies      
i thought it was bad programming to not insert the {} even if the statement only had one line of code. is this not true?
justizin 4 days ago 1 reply      
blah blah blah acm blah
usumoio 4 days ago 3 replies      
I think this guy might be confusing code with style. This is why I never ever let the interns not use braces. We built a style guide that is good at blocking these types of errors. Not to mention that Apple should have tested enough to catch that error... It is about the code, but maybe not in the way this guy thinks.
Dropbox acquires Hackpad (YC W12) hackpad.com
228 points by yukichan  3 days ago   149 comments top 19
UVB-76 3 days ago 10 replies      
This is how it was always going to go.

Dropbox's core business is unsustainable, and they can't compete long-term with rivals like Google and Apple.

They're flailing in all directions at the moment; pushing for the enterprise/government market with the appointment of Condoleezza Rice, now burning a load of money acquiring businesses offering tangential services, in the hope they can diversify their business model.

It won't work. Acquisitions like this never go to plan, and they are almost always a waste of money.

amykhar 3 days ago 5 replies      
Just a little side note. I really wish people would give an overview of what their service is or does in press releases like this. Quite often, I see 'Facebook bought x' or 'Dropbox bought y' and I click to see what it is, and if I would want to use it. More often than not, there's no little blurb that lets me know what their product even does.
rattray 3 days ago 1 reply      
I think this makes a lot of sense for Dropbox. Documents are moving online, which means people won't need Dropbox for them.

I have a half-written blog post from months ago on why Dropbox should buy Quip for this reason - they should be trying to leapfrog Google Docs to stay competitive.

Best of luck to the team!

ChuckMcM 3 days ago 1 reply      
One of those 'no brainer' moves, glad to see it got done. Love the irony of a YC exit as an acquisition by a YC company :-) Congratulations, hackpad is an awesome product and the combination with Dropbox has excellent potential.
xianshou 3 days ago 2 replies      
Acquihires are pretty much the default hiring method these days, so "victory" now requires keeping the product active after acquisition.

A toast, then, to Hackpad. Well done.

kunle 3 days ago 0 replies      
This is an exceptional deal. The Hackpad team is awesome, the product makes sense, and I remember thinking, after Box bought Crocodoc, that Hackpad would make sense as part of Dropbox, especially as it went enterprise and started competing with Word, Google Docs, etc.

Congrats to the Hackpad team and to Dropbox here. Solid deal.

unhush 3 days ago 1 reply      
My favorite parts of Hackpad were the features that weren't intended to have mass-market appeal (ex: code syntax highlighting, markdown-inspired keybindings, ability to easily create/delete accounts). These will likely be gone in whatever notes product that Dropbox makes with the help of the (wonderful) Hackpad team.

So for me, this acquisition seems like a loss. I realize that Hackpad has said that they'll keep the site alive, but I expect it to be less functional if everyone maintaining it is a full-time Dropbox employee now. Fingers crossed that there will someday exist a good collaborative doc editor for hackers that doesn't fall over when >10 people connect or require a Google account!

Full disclosure: I have written code and done security auditing for Hackpad. I tried to get them to add vim mode. :)

jdp23 3 days ago 1 reply      
Congrats Alex and Igor, Hackpad is really impressive. Great move by Dropbox.

What's the state of the opensource alternatives? Etherpad development seems to have plateaued a while ago.

bitsweet 3 days ago 0 replies      
Hackpad is pretty awesome. Glad it will still be running after the acquisition. Congrats Alex & Igor
quadrangle 3 days ago 2 replies      
Ugh, Etherpad shoulda been copyleft. Hackpad is a travesty for being so non-transparent that they're just a tweak of Etherpad.
Shank 3 days ago 0 replies      
Please don't shut it down like Readmill. I use Hackpad daily, and I'd hate to see it go the way of Readmill, shoved down the toilet.
brianr 3 days ago 0 replies      
We love Hackpad at Rollbar. Congrats guys, keep up the good work!
matthuggins 3 days ago 1 reply      
Is the landing page terribly jittery for anyone else? I can barely tell what's going on and scrolling takes a bit to respond.
stillsut 2 days ago 0 replies      
DropBox looks like the company that walks on four legs, then two, then three. First, it was just a smart choice to replace emailing yourself files. Pretty soon, it will be an IBM, where some sales guy will convince your boss that the DropBox Q1000 is what your business needs for synergy and you'll end up having to use it.
dduvnjak 3 days ago 0 replies      
Well, this is not very promising: http://i.imgur.com/1vCvZzI.png
orik 3 days ago 3 replies      
Hackpad and Loom? Dropbox is on a bit of a feeding frenzy.
elliott34 3 days ago 0 replies      
hahahaha ctl-f for "journey"
Numberwang 3 days ago 1 reply      
Hackpad seems down at the moment. Wouldn't trust them with my documents.
matthewcford 3 days ago 1 reply      
Sounds like redirection for the Condoleezza Rice fallout.
British Pathé Puts Over 85,000 Historical Films on YouTube openculture.com
228 points by jamesbritt  3 days ago   35 comments top 10
hf 3 days ago 4 replies      
An impressive collection to be sure. Slightly hyperbolically, the British Pathé archive puts it thusly:

"This archive is a treasure trove unrivalled in historical and cultural significance that should never be forgotten."[0]

However, I am left wondering why "[u]ploading the films to YouTube seemed like the best way to make sure of that." Perhaps fittingly, there's no clear indication which licence, if any, is applicable.

What could've possibly impeded a parallel upload to the Internet Archive?


Killah911 3 days ago 1 reply      
While watching footage of when Hitler came to power, I got a pop up on YouTube to the effect of Obama wanting to take away guns & how I'd vote. It's nice they put it up on YouTube so the masses can see this amazing footage (I saw the Wright Brothers' flight for the first time there). But watching that ad pop up just drove home the point that we just traded humans for pigs to run our farm.
gdewilde 3 days ago 3 replies      
Oh.. moar awesome than the patent database, most don't get that either... innovation, technology etc (?)

Everything from Flying cars...


...robotic car parks...


.. the tarring of roads...


To the imagination of science...


gravity powered generators..


something called wind energy(?)


and the miracle of democracy...


v1tyaz 3 days ago 1 reply      
I don't understand the people commenting that YouTube is not a proper archival tool. Obviously. They're not deleting their own copies of these films, they're just making them available to the public in an easy to use manner. Criticism of this is totally misguided.
ithkuil 3 days ago 0 replies      
Here's another historical film archive: http://europeanfilmgateway.eu/
taivare 2 days ago 0 replies      
The Virginia 1967 reminds me of my youth, when we still had a demand-side economy and a thriving middle class.
maga 3 days ago 0 replies      
I wish they came with the original commentaries that the press used at the time they first appeared. It would allow one to see those "news" through the eyes of contemporary viewers.
TuxLyn 3 days ago 0 replies      
Very impressive collection indeed. Now let's wait another 40 years for a modern 1977-1990 collection ^_^
samstave 3 days ago 1 reply      

    dl-2-mp3() {
        # download and save a youtube video, and extract MP3 audio track
        youtube-dl -x -k --audio-format mp3 "$1"
    }

    alias ytdl=dl-2-mp3

dublinben 3 days ago 1 reply      
It's too bad that this comes across as a marketing ploy. They're still charging for licenses to actually use any of this footage in any way. They haven't actually released this material under a Creative Commons or Public Domain license, so most of it is All Rights Reserved.

At least they don't have to pay for their own hosting now, to show off their video archive!

I wish I could read Wikipedia like this assaflavie.com
218 points by assaflavie  4 days ago   61 comments top 30
Svip 4 days ago 3 replies      
This already exists on Wikipedia.

Under preferences, select the Gadgets tab and enable "Navigation popups".

When hovering over a link in a Wikipedia article, a small popup will appear, in addition to some metadata about the link, the first paragraph of the article will appear, including links that may appear in that paragraph, which can also be hovered to make another popup appear.

ygra 4 days ago 2 replies      
Incidentally, this was what my diploma thesis was about. The basic idea was to have a single text that contains lots of possible branching or expansion points, depending on certain criteria. Depending on those criteria you'd get an expanded or contracted text to read, tailored to what you already know or are interested in. Altering the text in small parts (like [1], mentioned in another comment; I found that one much later, though) was one part of it; inline expansions, if necessary or wanted, were a possible UX enhancement I thought of (avoiding a completely static text where you'd have to tell the engine everything beforehand).

Use cases I thought of were mainly adjusting texts explaining things (e.g. Wikipedia, school books, etc.) to the existing knowledge of the reader, so that an article explaining a concept could look radically different (and go into increasingly more detail) depending on what the reader already knows. One proof of concept I created was adjusting the German Wikipedia article on Turing machines to three different levels (school, 1st-semester CS student, 4th-semester CS student) [2]. What each level does is either explain things differently or leave out parts altogether (no need to horrify a pupil with formulae). So my main focus was on providing something that reads well (expanding things inline still incurs a context switch because they're not part of the original narrative) and finding a way to model such things. Nothing automatic, because a clear semantic model is needed for that to work.

Thesis can be found at [3]; it's in German, but the abstract is in English too. Just in case someone might be interested in that.

[1] http://www.telescopictext.com/ mentioned by https://news.ycombinator.com/item?id=7602335

[2] http://hypftier.de/temp/turing.html sorry, it's in German, but Google Translate seems to work fine.

[3] http://hypftier.de/temp/Diplomarbeit.pdf

tty 4 days ago 6 replies      
Wikipedia currently has a Beta feature named Hovercards.

You can enable it here


assaflavie 4 days ago 1 reply      
About popups/tooltips/hovercards: I considered this approach, but decided to go with this sort of "inline" expansion for the following reasons:

1. A popup obscures the original text. This interferes with context, albeit admittedly to a much lesser degree than full-blown navigation, but still.

2. Multi-level popup hierarchies just don't look good, in my opinion.

3. You must dismiss a popup to continue, whereas with inline expansion you can just as well keep the expansion and continue reading.

4. The hover approach doesn't work as well for touch-devices (i.e. without a mouse).

sksksk 4 days ago 0 replies      
I find the Google Dictionary plugin is pretty good for reading wikipedia like this...


Goladus 4 days ago 0 replies      
Any article on an advanced topic is going to be difficult if you lack prerequisite knowledge, no matter how fancy the cross-reference ui.
MichaelAza 4 days ago 2 replies      
Wish granted: http://wikiballoons.tomodo.me/wiki/Main_Page

I also made a (by no means complete) POC more similar to your animation, which you're welcome to fork and improve: http://en.wikimore.tomodo.me/wiki/Main_Page

In general, whenever you want to improve a website - Tomodo is the place to go (http://tomodo.com/)

Aardwolf 4 days ago 1 reply      
Since my primary language is Dutch, it's easy for me:

If I want a short article, read the Dutch version. If I want too many details, read the English version :)

alxndr 4 days ago 0 replies      
Similar, but apparently "one-way": http://www.telescopictext.com/
bachback 4 days ago 0 replies      
welcome to Ted Nelson's original idea of hypertext. further information here: http://en.wikipedia.org/wiki/History_of_hypertext
zodiac 4 days ago 1 reply      
I thought this was a really interesting idea and tried implementing it. You can check out the demo here:


Note that the "sinusoidal wave" link isn't working, as it redirects to "sine wave"...gotta work around that :(

If you want to read/modify the code send a pull request to github.com/zodiac/xuanji.appspot.com/, the commit which implemented this is https://github.com/zodiac/xuanji.appspot.com/commit/872b40dc...

lewispollard 4 days ago 0 replies      
I use the 'wikipedia quick hints' chrome extension and it works really well for me. It works kind of like the navigation popups already mentioned - but links inside the modal can also be hovered over, 'recursively'.


Holbein 4 days ago 1 reply      
While the suggested feature would be an improvement, the real goal imho should be that each Wikipedia article can be read and understood by itself, without the need to go on a recursive yak shaving expedition through other articles first.
epochwolf 4 days ago 0 replies      
If you have a mac, you can right click on any word and choose "Look up in dictionary". The dictionary pulls definitions from its internal dictionary and wikipedia.
jasongrout 4 days ago 0 replies      
The American Institute of Mathematics has been promoting "knowls", which is a very similar idea and implementation: http://www.aimath.org/knowlepedia/

Rob Beezer's free open-source linear algebra book uses these extensively to provide more details in proofs and in exposition. It's very nice. For example, click on the underlined blue text at http://linear.ups.edu/html/section-SSLE.html

kubiiii 4 days ago 0 replies      
I like it a lot, my use of wikipedia often leads to a bunch of opened tabs and a lack of focus on what I was looking for in the first place (which might also be a great thing depending on your mood).
ozh 4 days ago 1 reply      
Neat idea, but the article also made me discover Basic English and its 850 words. Didn't know about that, I think that should be mentioned when you learn English.

Does anyone know of similar general purpose word sets for other languages?

enscr 4 days ago 1 reply      
I prefer popups (floating elements) instead of altering the flow of the document. The continuous re-flow of the document is distracting.
Raphmedia 4 days ago 0 replies      
Take a look at https://github.com/sanketsaurav/easywiki which does exactly what he proposes and works on other websites when hovering over Wikipedia links!
hngiszmo 4 days ago 0 replies      
I want my computer to get a concept of what I know. It should then gray out stuff I know and linkify stuff that might be new to me. Semantic Web is the key phrase here. Decades of research went into it, but this magic tool still isn't anywhere near ready yet.
Dewie 4 days ago 0 replies      
Yeah... try to look up some category theory terms you found in the wild on Wikipedia. You will not be enlightened.

I guess this can be generalized to most mathematics on Wikipedia.

mosselman 4 days ago 0 replies      
I like your concept a lot. It would be a great addition when reading complex things.
softgrow 4 days ago 0 replies      
Maybe the prose could be written to assume less knowledge and reduce the amount of other material needed for most people to understand? Alternatively as a user preference the prose could auto-include the meanings of some terms?
brunorsini 4 days ago 0 replies      
This reminds me of Apture, which Google acquired in 2011.
Zakuzaa 4 days ago 1 reply      
tldr.io chrome extension does something like this
r0h1t4sh 4 days ago 0 replies      
It will be great to see this as a Chrome/FF extension.
dredmorbius 4 days ago 0 replies      
Tree-style tabbed browsing makes tree-traversal style web browsing far more tractable.

Sadly, Chrome doesn't have a tree-style tabs mode, nor can it be provided by the existing Chrome extension format. There are a number of tree-style tab extensions for Firefox (e.g., "Tree Style Tab": http://piro.sakura.ne.jp/xul/_treestyletab.html.en).

There are any of a number of changes to the browser model I'd very much prefer aimed at information management, as opposed to the web-applications centric focus of the past several years.

In an interesting departure, Kobo advertised among the features of its (Android-based) tablet a web browser which simplified pages to an easy-to-read view. Based on present pre-loaded apps bundles, it looks as if that was the Pocket browser, but I still find it interesting that this was pitched as a benefit to a largely nontechnical, literary audience. It certainly reflects my own growing dissatisfaction with present-generation browsers.

badgateway 4 days ago 1 reply      
Didn't Ted Nelson suggest this sort of in-situ display for transclusion in Xanadu?
api_or_ipa 4 days ago 0 replies      
This is why some IDEs allow you to to "peek" at a definition, for example.
PhasmaFelis 4 days ago 0 replies      
What we really need is for more Wikipedia writers to remember that it's an encyclopedia, not a technical reference manual.
Metacademy metacademy.org
219 points by edwardio  21 hours ago   32 comments top 21
rogergrosse 20 hours ago 5 replies      
Co-creator of Metacademy here. Colorado and I intend Metacademy to be a package manager for knowledge, where you can easily find the particular thing you want to learn about (e.g. deep neural nets) without having to track down all the prerequisites (e.g. gradient descent) yourself. Basically, we've annotated a dependency graph for the core concepts in the field, and it uses the graph to produce step-by-step learning plans geared directly towards your goals.

Sorry if you can't reach the site. We're both doing this as volunteers, and I guess we weren't prepared for the level of traffic we're getting. In the meantime, you can find more details about our high-level goals here: http://hunch.net/?p=2714

If you don't have a particular goal in mind and just want a general overview, check out the roadmaps, e.g. http://www.metacademy.org/roadmaps/rgrosse/bayesian_machine_...

tiefenbach 2 hours ago 0 replies      
Nice work! I've been working on something very similar. My project still needs a lot of work (and talent, I started working on this site when I first learned html/css/js 2 years ago) but you can check it out at: http://subjectflow.com , or for a full tree: http://www.subjectflow.com/viewer.htm?=mewvzl2i6bt97gnjf9i4 (click on the white dots to expand the tree).

I really hope to see this type of idea take off. I would love to see this sort of thing combined with simplified git related operations to make it easy to improve education content. Additionally I think it would be great to add in a comment system where you can leave a question connected to the exact text that is confusing you.

Link- 14 hours ago 0 replies      
IMHO, the dependency graph is pure gold! I think it should be the core focus of this project, as it will be quite difficult to aggregate, clean and present all the needed information about a given topic (unless we're rebuilding wikipedia). It will be easier to link to credible resources available across the web through the dependency graph. Maintenance-wise it would be much more interesting, not to mention easier.
catshirt 1 hour ago 0 replies      
love this. i've wanted something like this for a long time. Khan Academy mind maps are awesome but limited in terms of subject matter. having an open database of this is amazing; enables self learning while eliminating the problem of not knowing what you don't know.
nathancahill 18 hours ago 0 replies      
Been clicking around for 20 minutes. This is like TV Tropes for knowledge.
Houshalter 1 hour ago 0 replies      
I've been using this. It's extremely useful. I wish they did this for every subject.
klapinat0r 18 hours ago 1 reply      
How is "x hours to learn" measured? If I search for logistic regression, I get this page: http://www.metacademy.org/graphs/concepts/logistic_regressio... stating 1.7 hours - is that the estimated time to work through "Core resources", the "read" time + video length or something else?

Really exciting project, and very useful to me personally.

This is what Wikiversity should've been.

Good job guys.

dorian-graph 7 hours ago 1 reply      
I like the approach! I feel that search could use an improvement. I'm using k-means++ at the moment for something and searched for "k-means", which yielded no results [1]. I clicked on the link to look at the full list and found k-means mentioned 3 times, to my surprise.

[1] http://www.metacademy.org/search?q=k-means

rcthompson 18 hours ago 1 reply      
Is there a way to view the DAG of concepts itself?
riffraff 21 hours ago 0 replies      
this seems like an interesting idea, but it lacks some "high level structure", or possibly a better search.

For example, I am sure there are general topics for "classification", "recommendation", or "similarity", but I wouldn't know where to start.

Gatsky 3 hours ago 0 replies      
This is a dream, thank you.
paufernandez 17 hours ago 1 reply      
Nice! This is the first time I've seen the dependency graph implemented in e-learning! Actually I started a programming site in Spanish a while ago with the same idea (there is a dependency graph underlying the content). If you are curious: http://www.minidosis.org.
dovel 7 hours ago 0 replies      
I think this is fantastic. Obviously there's stuff to be ironed out, tidied up and better organised, but I genuinely think this is a great idea. How does one go about developing something like the dependency graph? Would love to implement it in a site I am building while learning Rails.
michaelochurch 13 hours ago 0 replies      
This is really cool. I wish I could give it more upvotes than the one I gave it.

Thanks for building something cool and useful. I love the idea and hope you develop it more.

cjrd 19 hours ago 0 replies      
pranavpiyush 15 hours ago 0 replies      
This deserves to exist and flourish. We built something similar (skilldom.org) but it hasn't yet gotten the love it deserves... :) partly due to me not being disciplined enough to dedicate time to it.
danra 16 hours ago 0 replies      
Metacademy looks like a great source for learning, one which preserves most of the associativity benefit of learning through e.g. wikipedia, but which allows for much more focus.

Thank you so much for doing this!

Thirdegree 11 hours ago 0 replies      
This is awesome. I can see this, if it gets larger, being a ton of help throughout college.
v_paidi 11 hours ago 0 replies      
Nice way to organize major sources for learning Machine learning topics.
rajeevk 20 hours ago 1 reply      
The site is not accessible (not responding to HTTP requests).
known 11 hours ago 0 replies      
Ask HN: Idea Sunday
216 points by karangoeluw  1 day ago   464 comments top 102
avalaunch 21 hours ago 11 replies      
A major pain point in my life is house maintenance and repair. I really miss having a (good) landlord. That's what I want - a single point of contact to manage the upkeep of and fix any issues related to my home.

There's so much friction in the process of finding the various contractors needed to keep a home in good order. First you have to search on craigslist, angieslist, or google for contractors that service your area. Then you play phone tag with each. Then you schedule a time for each to come out and give an estimate, which can be a major interruption to your week. Then you schedule a time for the chosen contractor to actually complete the work. And then you cross your fingers that you chose well, because if you didn't, you'll have an even bigger headache on your hands.

Ideally I would pay a large monthly fee (500-1k) and absolutely everything would be covered. Regular maintenance would simply get done without requiring anything of me. My lawn would get cut when it needs it. My driveway would be cleared when it snows. My gutters would be cleaned as needed. My home would be cleaned twice a month. And so on. Whenever anything else needs fixing, I'd have a single point of contact (an app, maybe) where I could open a ticket. I'd then be offered a selection of times when an expert could come fix the issue, and after selecting a time that works for me, an expert would actually show up at that time and fix the problem. A little more friction could be removed if I could preselect times when it's acceptable for maintenance personnel to enter my home. Ideally I wouldn't even have to be home. The service would also have permission to deal with my insurance company as needed since that's also a major pain point. They'd cover anything not covered by the insurance. Or perhaps I could do away with my existing home insurance in favor of this full-service home insurance company.

To begin the service, someone would have to perform a full home inspection to uncover preexisting issues which wouldn't be covered. The service could help take care of those issues but it'd have to be on an a la carte basis. Once the home was up to snuff, then the monthly fee would kick in and cover any new issues, as well as regular maintenance.

A simpler version of the idea, which wouldn't be as good but would have a lot less risk, would be to offer maintenance only: lawn cutting, regular cleanings, etc. I'd still pay good money for that.

elemos 2 minutes ago 0 replies      
A safe storage locker. I travel frequently for work between a few different cities and it's a pain to constantly have to pack and re-pack, and buy and re-buy those things which I cannot pack.

I just want a locker where I can feel safe leaving my belongings when I'm not in town, safe both for the belongings physically and legally, like the rights you get when you own or rent property.

miguelrochefort 22 hours ago 7 replies      
Why are recipes linear and textual? I'm surprised the recipe-book metaphor stuck with us for that long.

I want a social cooking platform where the only way to represent a recipe is with a diagram.


You're a cooking master? No need to explain to you how to make a roux or how to blanch vegetables.

Don't have butter? We'll replace the step where you need butter with the steps to make it.

Allergic to peanuts? These nuts are a good alternative.

Let's build a semantic recipe platform that's not linear and add a functional twist to ingredients (the part where you can substitute an ingredient by the function that returns one).
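The substitution idea sketches nicely as code. A minimal, illustrative model (all recipe names and steps are made up): each recipe is a list of steps over ingredients, and a missing ingredient gets its own sub-recipe inlined in its place.

```python
# Hypothetical recipe store: name -> list of (action, ingredients) steps.
RECIPES = {
    "butter": [("churn", ["cream"])],
    "bechamel": [("make roux", ["butter", "flour"]), ("whisk in", ["milk"])],
}

def expand(recipe, pantry):
    """Flatten a recipe into steps, inlining sub-recipes for
    ingredients the cook doesn't have on hand."""
    steps = []
    for action, ingredients in RECIPES[recipe]:
        for ing in ingredients:
            if ing not in pantry and ing in RECIPES:
                steps.extend(expand(ing, pantry))  # substitute: make it yourself
        steps.append((action, ingredients))
    return steps

# A cook with butter skips the churning step; one without gets it inlined.
print(expand("bechamel", pantry={"cream", "flour", "milk", "butter"}))
print(expand("bechamel", pantry={"cream", "flour", "milk"}))
```

A real platform would need cycle detection and allergen-aware alternatives on top, but the non-linear, functional core is just this: recipes as a graph, not a script.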

mden 22 hours ago 8 replies      
Idea: Tree of knowledge

Ever been interested in a topic where, once you google it, you end up with explanations (quite often on Wikipedia) that rely on foundational knowledge you didn't even know you should have? And then you start working your way back by googling things you didn't know until you hit something you do know, and from that point you try to inch your way forward to the original topic, just to get discouraged a few hours in? I know I have, and it's a pain!

A well-built knowledge map that graphs the relationships between different topics in a field would help alleviate this problem. Take, for example, linear algebra. You've heard about this fancy thing called singular value decomposition but barely know what a matrix is. You type SVD into a search box, and it generates a breadth-first tree with all the topics you need to know to be able to understand SVD, up to a certain depth. Then you just work from the leaves you do understand up to the topic you're interested in. This saves hours or sometimes days of just trying to figure out the order in which you should be learning things. It essentially builds a curriculum for the user on the fly for whatever topic they're interested in.

I would propose this as a community wiki so knowledge maps could be crowdsourced and curated as they would be time consuming and difficult to build for a single person. Would also suggest adding the ability to let users create accounts and mark off topics they feel confident they know.

Potential problems: The two big problems with this idea are 1) generating a proper knowledge map: There will be ambiguities in the edges and even the nodes of a map. Sometimes (often) you will need to be clever about how you organize the information. For example, you might have a dependency listing like: Matrix <- Rotation Matrix, but in reality it might be better to have something like Matrix <- Linear Transform <- Rotation Matrix. Linear transforms would act as an intermediary node for rotation, scaling, shearing, w/e.

2) topics can be studied in different frameworks: e.g. linear algebra can be studied with or without vector spaces. Once again, deciding how to create the knowledge graph will be difficult.

Solution: Have multiple types of edges. You can have edges to signify hard dependencies, soft dependencies, generalizations, and extensions. Maybe other types of edges. You will still need to be clever, but having a way to signify the relationship between topics will help resolve the problem.
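The curriculum-building part is straightforward once the graph exists. A toy sketch (the topics and edges below are made up, and it uses a depth-first topological sort rather than a literal breadth-first tree, so every topic lands after its prerequisites):

```python
# Hypothetical prerequisite graph: topic -> topics it depends on.
PREREQS = {
    "svd": ["eigendecomposition", "matrix"],
    "eigendecomposition": ["matrix", "determinant"],
    "determinant": ["matrix"],
    "matrix": [],
}

def learning_plan(goal, known=()):
    """Return the prerequisites of `goal` in an order where every
    topic appears after the things it depends on."""
    plan, seen = [], set(known)

    def visit(topic):
        if topic in seen:
            return
        seen.add(topic)
        for dep in PREREQS.get(topic, []):
            visit(dep)
        plan.append(topic)

    visit(goal)
    return plan

print(learning_plan("svd"))                    # start from scratch
print(learning_plan("svd", known={"matrix"}))  # skip what you already know
```

Marking off topics you're confident in, as suggested, is just seeding `known`; the hard part is curating `PREREQS` and handling the soft/hard edge types described above.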

manish_gill 22 hours ago 11 replies      
This might be fairly simple, but I can't find a good solution - A replacement for Google Groups.

More specifically, a better UI to use Mailing Lists. Perhaps like vBulletin or other advance forum software. Maybe even built-in support in my email client?

For the life of me, I can't find a good way to use Mailing Lists. I don't like receiving 40+ messages every day, but I don't find digest mode good enough either. Google Groups is clunky, and it gives me no good motivation to return to it. The readability is also not all that great imo. The whole ajaxy thing it has going for it is also bad. I want to read static text on a functional and beautiful UI. It's not too much to ask for. :(

egypturnash 21 hours ago 4 replies      
Basically Yelp for transgender surgeons.

I've been working on making the decision as to who I'll get to sculpt new genitals for me, and researching this on the web is a mess - every site comparing them is out of date, triggers my mental sketchy spam site detectors, or both.

It'd be great to be able to go to a nice-looking site and say "all I'm interested in right now is MtF genital surgery", then see doctors who do that, and crowd-sourced reviews of their work. (Other people may be interested in FtM genital surgery, breast augmentation/removal, orchiectomy, you get the idea - various manipulations of genitals and secondary gender cues.)

I think there's probably less than a hundred people who offer these kinds of services in the world, so it's not exactly a huge database to worry about.

personlurking 23 hours ago 12 replies      
Just going to throw this one out there. I live in Lisbon, it's tourist-central (I'm from SF and I've never seen so many tourists). Since I see a ton of lost people every single day, there should be a way to digitally leave comments on things and places, and a free one-stop shop to find such info (like Wikitravel). This info, although having a central repository, should be pushed out to an app that connects to one's phone (in particular, GPS) so that when you need help with figuring out where you are, what statue you're standing in front of, etc., you can open the app and have it tell you (no entering anything... only if you want to get to another location).

By entering what you want to do beforehand, the app would know where you are and have a list of places you said you want to go, and tell you how to get to the next closest place, or alert you if one on your list is about to close for the day. Perhaps each city version is done by locals and in case of bad actors, there can be a voting system so the right info goes to the top. Plus, there could be integration with Google Maps so you can see if you're going the right way.

ezl 1 hour ago 1 reply      
Better calorie counting.

MyFitnessPal is great, but it takes too long for me to use; I'm bad at looking up, or even knowing, exactly what I ate.

What I'd like is an app that lets me just take a picture of my meal, then add any comments that might be helpful "ham sandwich" or "poached salmon, about the size of my palm" -- then it magically figures out what I ate and the nutrition contents (within a day is fine)

I've seen this attempted before, but nobody still seems to be in business. I suspect this has to happen with a human powered "pictures+comments to nutrition data" engine.

I'd be willing to pay $30/mo (too low? $50?) for this and commit to paying for 4 months. If 99 (or 59) other people committed as well, that'd be $3000/mo in revenue from the day this thing launches.

1. Would anyone else back that as a pre-signed-up user?
2. Is anyone willing to do it for us?

basicallydan 21 hours ago 5 replies      
I've been sitting on this gem for a while. I present to you: The Bruce Wayne Gap Year.

Wealthy people with desires to become Batman can more-or-less do so.

First they sign a waiver and an NDA, removing any liability from the Bruce Wayne Gap Year company. Then they'll pay the company tens of thousands of dollars to pay for what is to come.

They'll be put into a real criminal gang [1], and taken around the world getting involved in all sorts of illegal [2] activities.

Sooner or later they'll be subtly led to the Himalayas, where they'll join a monastery, lead a simple life of celibacy and minimalism and slowly learn to meditate and fully understand themselves and their body.

After a while, they'll be groomed by a man [3] claiming to be working for a mysterious and powerful leader of a guild of assassins, and taught all kinds of martial arts over months and months, culminating in a complex battle which determines their eligibility. At that point, they will be asked to do something their morals will not allow (this will be determined in a psychological screening), and end up betraying and destroying [4] the guild.

Then they return home, better for the experience.

It can't fail. A friend of mine also suggested it be re-implemented for all sorts of action hero/film type situations. James Bond, Die Hard, Rambo, etc. It's essentially a very expensive, realistic roleplaying experience.

[1]: Actually, very highly paid and well-trained actors. We don't tell them that though.

[2]: Mostly not illegal, but they're made to believe that these things are illegal. Some things will be borderline (they may accidentally end up threatening people who are not part of the ruse, for example), hence the NDA.

[3]: Also an actor. A very good, very well paid actor. Possibly we'll just get Liam Neeson, and he'll act so well that he'll convince them that he's not Liam Neeson.

[4]: Not really. The martial artists will never be allowed to be worse than the client, and will also be stunt-trained and capable of faking death.

harryh 21 hours ago 7 replies      
Build a company around creating a best in class development environment that they can sell to other tech companies. This would involve everything from repository management (on top of git) to build & compilation tooling to automated testing and probably more than that eventually.

Once companies reach a certain scale they inevitably expend some of their resources on building internal development tools. At Foursquare we have 1 person (on a team of ~80) doing this fulltime. Google has spent a ton of effort on this with blaze. Facebook & Twitter have done similar work. But it's all fragmented and it's all reinventing the wheel.

A company should do this right for everyone. If it was good enough I'd happily write very very large checks to use it.

Honestly I think this is what GitHub should be doing, but they don't appear to have their shit together enough to innovate so someone else should do it.

raldi 23 hours ago 1 reply      
Unlike "Who is hiring?" posts, which are ephemeral, "Idea Sunday" posts continue being useful for a long time. Therefore, consider posting a link at the end of each one allowing readers to jump to the previous one.
miguelrochefort 23 hours ago 9 replies      
An IDE for ideas. Intellisense for thoughts.

For those of you who develop using powerful IDEs (such as Visual Studio, Eclipse, ...), it's hard to imagine going back to a basic notepad.

Most people, most of the time, don't write software. They exchange ideas, express wishes, share their feelings. And to do that, they use tools that are not more powerful than a basic notepad.

This forces them to be explicit, to explain what they mean, to repeat ideas, to think linearly.

I believe it's time for the average person to have access to tools that are just as expressive (if not more) than the ones developers have been using for years. It's time to break the speech metaphor and develop a completely new way to communicate. It's time for a UI-driven, computer-assisted, general-purpose language.

What I suggest we build is an IDE for ideas. Intellisense for thoughts.

wting 22 hours ago 2 replies      
There are a lot of parallel conversations on Reddit / HN / etc. for various articles.

What about pulling highly rated, top-level comments from multiple sources into a quick digest? Sort of like Google News for commentary.

rdl 19 hours ago 1 reply      
A tablet (iPad or Android) game which is designed to be used while exercising.

Use ANT or Bluetooth 4.0LE to tie into a treadmill, bike, or other exercise equipment to get output measurements (speed); ideally, find some devices which allow two way commands (not common at all right now).

Networked games against other people, or vs. computer or past personal performance. The interesting part is a "use while running" interface for the touchscreen, requiring inputs (using gross motor skills, not fine control) to do things in-game while retaining performance. Or maybe use audio output for instructions (e.g. "press the blue button, then the red, then the green") while keeping heart BPM above 130 and targets moving on screen.

miguelrochefort 22 hours ago 7 replies      
Sell anything in a snap.

1. Find something you want to sell

2. Snap a picture (or a short video)

3. Tap "list for sale"

4. Let mechanical turk + computer vision identify the object

5. Let the system pick a value (based on sales history, location, demand)

6. Contact the seller when a serious buyer made a deposit

7. Proceed to demo + sale

I shouldn't have to write down any spec when selling something as ubiquitous as an Xbox 360. I shouldn't have to go through 100 different ways to describe an iPhone 4S when looking to buy one.

Delegating item identification to a third party is how you reduce the friction of listing items for sale and improve semantics.

And to think that this system only applies to selling items is naive. The possibilities are endless.

monk_e_boy 23 hours ago 8 replies      
Web filter. I love F1 and there is a race today, so I can't look at 99% of the internet as they will show the result. I will watch the race later when the kids have gone to bed.

This problem is so big that I have to avoid Facebook because it also shows trending news.

So: a filter that blocks F1, or any selectable sports news. Then when I turn it off after watching the race, the filter shows me a list of the news it found and filtered for me.

As an added extra, while I'm watching the race it could show me tweets in real time, but time-shifted to match the race.

My football-loving buddy also agrees he'd pay for this filter.
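The quarantine-then-replay behavior described above is easy to prototype. A minimal sketch (topic list and matching are deliberately naive; a real version would need per-site content extraction):

```python
import re

class SpoilerFilter:
    """Hold back any item mentioning an active topic; release the
    backlog once the user has watched the event."""

    def __init__(self, topics):
        self.pattern = re.compile("|".join(map(re.escape, topics)), re.IGNORECASE)
        self.quarantined = []

    def feed(self, item):
        """Return the item if it's spoiler-free, else hold it back."""
        if self.pattern.search(item):
            self.quarantined.append(item)
            return None
        return item

    def release(self):
        """After watching: hand over everything that was filtered."""
        held, self.quarantined = self.quarantined, []
        return held

f = SpoilerFilter(["F1", "Grand Prix"])
print(f.feed("Local bakery wins award"))       # passes through
print(f.feed("Hamilton wins the Grand Prix"))  # held back -> None
print(f.release())
```

Keyword matching will over-block (any article mentioning "F1") and under-block (a result photo with no caption), which is probably where the real product work lies.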

rhythmvs 22 hours ago 5 replies      
A file naming convention (lightweight markup) that would allow us to store structured (meta)data right inside file names. Obviously inspired by Markdown and CSV.

We could then build lean, database-less asset management applications, while the user data (i.e. the files and their metadata) would always be portable, across platforms.

Take for example:

  J.M.W. Turner | Rain, Steam and Speed | 1844.jpg
  W. Blake | Newton | 1795-1805.jpg
as compared to the clutter we now must deal with:

  _IMG00123.JPG
  Turner_-_Rain%2C_Steam_and_Speed_-_National_Gallery_file.jpg
My practical use case: take snapshots of my incoming receipts, bills, etc., name the jpgs using the proposed file naming convention (including fields for VAT, net amount, etc.), put them in Dropbox, build a parser and accompanying GUI to edit file names (and their corresponding metadata; have total amounts etc. being calculated in real time), drop a link to that (web app) interface to my accountant.
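The parser half of that is tiny. A sketch of what it might look like (field names would be chosen per file set; everything here is illustrative):

```python
def parse_name(filename, fields):
    """Split a pipe-delimited file name back into a metadata record."""
    stem, _, _ext = filename.rpartition(".")
    values = [part.strip() for part in stem.split("|")]
    return dict(zip(fields, values))

record = parse_name("J.M.W. Turner | Rain, Steam and Speed | 1844.jpg",
                    ["artist", "title", "year"])
print(record)

# For the receipts use case, the same idea with hypothetical VAT/amount fields:
receipt = parse_name("2014-04-13 | Office supplies | 12.40 | 2.60.jpg",
                     ["date", "vendor", "net", "vat"])
print(float(receipt["net"]) + float(receipt["vat"]))
```

The tricky parts are the ones a sketch hides: reserved characters on each filesystem (`|` is illegal on Windows, for one), length limits, and keeping the field order convention next to the files themselves.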

It's just an idea for HN Idea Sunday; I did a somewhat more detailed write-up:


Bootvis 1 day ago 2 replies      
A bit lame maybe but here it goes:

The "Ready for Battle alarm clock". An alarm clock that wakes you up with your favourite quotes from video games or movies such as:

- Rise and shine, Mister Freeman. Rise and... shine. Not that I... wish to imply you have been sleeping on the job. No one is more deserving of a rest, and all the effort in the world would have gone to waste until... well, let's just say your hour has... come again. (or part of this one).

- It's time to kick ass and chew bubble gum... and I'm all outta gum

This idea would work best when you always wear your Google Glass like device. Then the audio can be combined with a nice visual of for instance the G-Man.

For now, without the glass integration, it's easy to make this with your own phone. A nice service could be to personalize the message, i.e. "Wake up Mr. Bootvis...".

The big problem here is that just copying these audio samples isn't allowed and so it will be hard to build a company out of this.

DanBC 17 hours ago 0 replies      
== Dying Skills, Lost Tech ==

I knew a chap who could roof a home with Cotswold stone. He knew how the stone was quarried (but he didn't do that bit) and how it was made into roof tiles (but he didn't do that bit very often) and he knew how to roof a house using those tiles. There are not that many people who can do that anymore.

There's a meme about the NASA Saturn V rockets that says we've lost the paperwork and thus re-making them would be very hard, and could involve rediscovering technology.

The Domesday project is sometimes used as an example of digital obsolescence http://en.m.wikipedia.org/wiki/BBC_Domesday_Project

And here's an example of someone looking for Cray software and code and documentation https://news.ycombinator.com/item?id=3464546

So, this would be a site that interviews people (using crowdsourced interviews) to glean information about how they do or did things, and why, with video if possible of them demonstrating the techniques and equipment and methods.

This would be a teeny bit like the Endangered Language Project. http://www.endangeredlanguages.com/about/

There would need to be some way to control for truth and accuracy[1] and also some suggestions for what is a good interview.

[1] my dad used to tell quite a few lies. One of these (well, one set) was about his diabetes diagnosis and treatment. He claimed he had been diagnosed as a child while still at school, and that he had to sharpen his syringe on the stone floor. Utter cobblers, but somehow it found its way to an academic site. I sent them a polite email and they made their disclaimers about uncorroborated etc a bit clearer.

binarymax 18 hours ago 1 reply      
Food Golf.

(Code golf - for recipes). A community where members submit recipes. Score is a function of ingredient count and cooking time. The lower the better. Recipes are also rated for taste/quality by the community.
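One possible scoring function, purely illustrative (the comment only specifies "lower is better" over ingredient count and time, with community taste ratings alongside; the weights here are made up):

```python
def golf_score(n_ingredients, minutes, taste_rating):
    """Lower is better; dividing by the taste rating means a recipe
    nobody likes can't win on brevity alone."""
    return (n_ingredients * 10 + minutes) / max(taste_rating, 0.1)

# A 3-ingredient, 10-minute recipe beats a 6-ingredient hour-long one
# at the same taste rating.
print(golf_score(3, 10, 4.5) < golf_score(6, 60, 4.5))  # True
```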

miguelrochefort 22 hours ago 1 reply      
A system to compose and track drug/supplement stacks, regimen and diets.

Transforming yourself has never been more accessible than it is today. We have access to so much information and so many resources that there's rarely any valid excuse not to become what you want to be. The problem is that the whole process can be overwhelming, and finding what works for you requires discipline and dedication. Most results don't happen overnight, and the only way to get through months of imperceptible progress is to have a clear plan, track everything, and learn from others.

I've seen all kinds of people in all kinds of contexts attempting to share some regimen with others. You will find that on Oprah, in books, at the gym, at your doctor's, on forums, etc. That's all fine, but why do they still have to write down the name of the product, the brand, the posology, the side-effects, their interactions, everything by hand? Wouldn't it be easier for them to write them down with a tool that understands what the regimen means, and easier for us to add them to our own regimen in a single click?

Once the system understands what I (and others) want to achieve, how I progress and exactly what I do to reach it, only good things can come out of it. It can learn (machine learning, correlation finding), it can recommend tweaks, it can help me acquire products, it can reward me, etc.

How hard is it to set up a database of all drugs/supplements/vitamins, and let people semantically fill in the why, what, when and how?

I don't think there's a lack of niches either:

- Bodybuilding

- Cognitive enhancement (nootropics)

- Weight loss

- Hair loss

- Skin care

- Allergies

- Acne

- Long distance running

- Diets (vegan, paleo)

- Chronic disease

- Life extension

sambeau 23 hours ago 8 replies      
A meetings clock that counts UP the price of a meeting based on the salaries of the people in the room.
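The core calculation is tiny. A minimal sketch, assuming a ~2000-hour work year (the function name and that assumption are mine, not part of the idea):

```python
def meeting_cost_per_second(annual_salaries, work_hours_per_year=2000):
    """Combined cost of everyone in the room, per second of meeting time."""
    hourly_total = sum(s / work_hours_per_year for s in annual_salaries)
    return hourly_total / 3600  # seconds per hour

# Five people at $100k each burn about 7 cents every second.
rate = meeting_cost_per_second([100_000] * 5)
```

The clock itself would just multiply this rate by elapsed seconds and render a ticking dollar figure.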
keithwarren 23 hours ago 3 replies      
RAID Arrays for online storage services.

So many services offer a free tier, maybe 5gb for free and then you pay after that. Some are much higher. Build sort of a proxy to these services so that you have a distributed and large free online storage system.

This was an idea my team had last year when we were looking closely at a photo organization and management startup. We had won a startup competition and had investors tendering offers, but in the end we decided not to pursue the idea, primarily because the storage business absolutely sucks, and photo systems are inherently storage businesses. This idea of 'BYOS' (Bring Your Own Storage) was one of the hacks we thought up to get around the problem, but in the end customer discovery taught us that the idea had too much friction for most people. Tech folks loved it, 35 year old moms didn't.

You can simply start with a few of the larger players, use the service to connect your free DropBox, Google Drive and OneDrive accounts. There may even be a monetization option wherein as you approach saturation of the storage you push the user to sign up with a specific vendor for a discounted deal and that other vendor can be a partner company or your own storage medium.

It has to be simple and transparent though: you still want people to have that simple sync experience regardless of where the file is stored, and they should be able to view all the files across all the services at one time, regardless of where they are physically stored.

11thEarlOfMar 20 hours ago 3 replies      
I frequently wonder if there is a place on the planet I'd be happier in. The idea is a site that allows me to select a wide variety of attributes and then search for places in the world that match those attributes. For attributes that I care about only generally, I'd be able to select from a broad category. For those that I care a lot about, I'd be able to drill in to highly granular selections.

For example, I may only care that: Government = Democracy

But for climate, something a little more specific: Climate : Rainfall < 200 cm; Climate : Snowfall = 0 cm

And then something really granular: Sports & Leisure : Adventure Sports : Sky Diving < 50 km

Major categories might include: Climate, Geography, Demographics, Government, Infrastructure, Security, Entertainment, Recreation, Culture, Education, Economy.

It could be marketed as a branded plugin for company web sites in travel, real estate and jobs. They'd pay for clicks and then use the results to market their services.

I've found sites that offer this, but none have been quite what I wanted. One requires you to enter the locations you think you'd like and then helps you decide. Another was pretty close, but only covered the USA.

marpalmin 23 hours ago 1 reply      
A kind of TaskRabbit that connects expats (who don't speak the language) with locals. The idea is that the local helps the expat with small tasks like understanding an insurance policy, a housing contract, or an employment receipt.
huherto 15 hours ago 0 replies      
A corporate firewall where employees can control which sites they want to stay away from.

I don't know if this exists. But some companies don't ban some sites because they don't want to appear to be too controlling.

I personally do this by changing my own /etc/hosts file, but it is too easy to override. A firewall solution may be better for technical and non-technical employees alike, helping them control their own browsing addictions.

rokhayakebe 22 hours ago 4 replies      
I compiled different ideas over the week.

coMarketing: I would like to see a solution that allows smaller companies that target a similar audience to be able to put their "little" marketing dollar together and have a better chance at fighting the big guy.

Phone screen on my computer: When I am at my desk, I want to be able to use my phone from my computer. I can use the desktop interface to go through contacts, check feeds, answer texts, and even forward calls to my desk phone.

Conversation blog: A blogging platform based on discussion. Each blog post is a conversation between two or more people. Let's stop the monologues; in real life, hearing two voices is more interesting than one.

Alarm Band: A simple band that wakes me up with a buzz, but I only want to spend $25 on it.

PHP wrapper: I began to write a simple consistency wrapper for PHP, but I never finished. Basically it's a class that gives me a clear structure for how to pass parameters to functions. I always call something(haystack, needle), and the wrapper rearranges the arguments to match the actual function's requirements.
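The argument-reordering idea can be sketched in a few lines. This is in Python rather than PHP for illustration; the table entries and stand-in functions below mimic the native argument orders of PHP's `strpos` (haystack first) and `in_array` (needle first), and everything else here is hypothetical:

```python
# Callers always pass (haystack, needle); the wrapper reorders for the callee.
ARG_ORDER = {
    "strpos":   ("haystack", "needle"),   # PHP's native order
    "in_array": ("needle", "haystack"),   # PHP's native order is reversed
}

def call_consistent(fn, name, haystack, needle):
    """Dispatch with a consistent argument order regardless of the callee."""
    if ARG_ORDER[name] == ("needle", "haystack"):
        return fn(needle, haystack)
    return fn(haystack, needle)

# Stand-ins mimicking the PHP functions' native signatures:
def strpos(haystack, needle): return haystack.find(needle)
def in_array(needle, haystack): return needle in haystack

call_consistent(strpos, "strpos", "hello", "ll")      # → 2
call_consistent(in_array, "in_array", [1, 2, 3], 2)   # → True
```

The real wrapper would presumably build `ARG_ORDER` once from PHP's stdlib documentation rather than by hand.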

rjf1990 23 hours ago 3 replies      
Airbnb, but paid with labor.

Many travelers are short on cash but would love to trade services for a night's stay. Me, I would be happy to host a guest for free provided they did my dishes or laundry.

Many homeowners, especially empty-nesters, have homes with plenty of space that they still have to maintain. This would provide benefits to both parties.

yzhou 21 hours ago 4 replies      
Here's what I want: a cheap text-only ssh terminal with wifi or cellular, a nice hardware keyboard, and extremely long battery life (or solar powered), which I can just throw in my car and forget. Whenever I'm away from my computer I can always log in to my cloud server and write code or make some quick fixes.
gus_massa 23 hours ago 1 reply      
Current Idea Sunday thread https://news.ycombinator.com/item?id=7616132 14 points, 7 hours ago, 22 comments
pubby 23 hours ago 4 replies      
A twitter/imageboard system where it takes 2 weeks for messages to appear once posted. The idea being that messages still relevant in 2 weeks are important and interesting ones.
pnathan 21 hours ago 2 replies      
Chores service.

For $X, we will come and do chore Y quickly and professionally, at a relatively flat rate, where Y is a typical household chore.

Particular pain point: cleaning the litterbox. I don't really like it, so it gets delayed a bit more than it should. Garbage can be a pain when the apartment building is poorly laid out.

I don't mean a maid or cleaning service. If I lived in a house, I'd want someone for random house maintenance tasks.

I'm half-tempted to do this where I live - I live near both a university and a fairly well-off suburb. Pretty sure a freshman would appreciate odd-job work not far from campus.

The catch is that I don't really have time to deal with bonding, insurance, payroll, workman's comp, etc, etc. Someone with 10-20K, familiarity with the process, lots of flex time over the next 4-6 months (when your help gets sick, YOU get to do it. =) ), and a yen for business could probably make a tidy income from it.

fest 20 hours ago 4 replies      
Ordering prototype parts (3D printing, CNC machining, lasercutting), reimagined. It's a major pain just to get a few parts made: first you find a company that does it, e-mail them your design, they get back to you with a quote, and you either accept it or go back to step 1.

What if you could just upload your design, select a material from a catalogue, and receive an instant quote? If you're happy with it, order the parts and then either pick them up in person or have them delivered.
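The instant-quote step could start as a crude pricing rule. A toy sketch; every rate and the flat setup fee here are made-up placeholders, and a real shop would price from the uploaded geometry (STL/STEP) and its own machine schedule:

```python
def instant_quote(volume_cm3, material_rate_per_cm3,
                  machine_minutes, rate_per_minute, setup_fee=15.0):
    """Toy quote: material cost + machine time + a flat setup fee."""
    return (setup_fee
            + volume_cm3 * material_rate_per_cm3
            + machine_minutes * rate_per_minute)

# A small 10 cm^3 part at $0.50/cm^3 needing 30 machine-minutes at $1/min:
quote = instant_quote(10, 0.5, 30, 1.0)  # → 50.0
```

Even a rough formula like this beats a multi-day email round trip, and the shop can refine it per material and machine over time.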

s369610 8 hours ago 0 replies      
Idea: ShowMeThere (needs better name, camanywhere?)

A phone/web app that lets you click on a place in Google Maps and request a "cammer" for x minutes. You pay $x, and anyone running the software on their phone near that location can accept the offer and start streaming video from their phone to you, giving you a kind of "live" street view.

There is UI for the payer to click arrows telling the cammer to move here or there, and to zoom or focus on certain things, plus a pre-translated set of phrases to help communicate with the cammer. The cammer gets paid after the x minutes are up.

Cammer gets cheap money for being a personal camera man for someone somewhere else in the world.

Client saves a trip out there to see something for himself.

Use cases: Want to see if an antique you are looking for is at the markets but can't get away from work? Want to check out markets in Turkey but live in Australia? Want to see what the surf is "really" like right now and whether you should bother heading out? Police work/chases! Heaps of uses.

For popular events and markets, a cammer could setup shop and offer high quality streams etc.

hershel 17 hours ago 0 replies      
One of the biggest problems in the new field of medical apps is lack of verification: how do I, as a doctor or patient, know whether an app works, and how well?

Of course this problem is generic to many types of products. Reviews are a partial solution, but it would be quite useful to have a site that gathers research and helps create valid research on various products.

gamegoblin 21 hours ago 1 reply      
I really hate the format of arguments/debates in person. I feel like they would be a lot more rational if done through email or something, where one isn't expected to respond immediately as part of a conversation.

I'd like a website which facilitated debates, allowed debaters to branch off multiple threads of debate and close threads once they are settled (ideally until all threads are closed), backreference other threads, citation lists that get automatically aggregated, etc.
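The branching-and-closing model described above maps naturally onto a tree. A minimal sketch under my own assumptions (class and method names are hypothetical); backreferences and citation aggregation would hang off the same nodes:

```python
# A debate as a tree of threads: any point can branch into sub-threads,
# and a thread can be closed once its participants consider it settled.
class Thread:
    def __init__(self, claim):
        self.claim = claim
        self.children = []   # branched sub-debates
        self.closed = False  # marked settled by the debaters

    def branch(self, claim):
        child = Thread(claim)
        self.children.append(child)
        return child

    def settled(self):
        """The debate is settled when this thread and every branch are closed."""
        return self.closed and all(c.settled() for c in self.children)

root = Thread("Remote work is more productive")
side = root.branch("But what about onboarding juniors?")
assert not root.settled()   # open branch keeps the debate alive
side.closed = True
root.closed = True
root.settled()              # → True
```

The "ideally until all threads are closed" condition is exactly the recursive `settled()` check.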

Ryel 12 hours ago 0 replies      
Full text search of Youtube videos.

How many times have you watched a video, maybe an hour-long conference talk, and wanted to go back to the point where something specific was mentioned? You have to skim to the area where you think it happened, then re-watch the video until you find what you're looking for. That sucks.

Even if Youtube allowed people to embed the full transcript of a video in some kind of search layer with corresponding points in the video that would be a dream.

Besides that I'd like a web browser built on 'the hive' that eliminates advertising. If you want to browse the web without being tracked, traced, prodded and harassed, you could install the browser, and a small, unnoticeable amount of your CPU is used to help power the hive. I would then like to see that organization make a charitable donation every 6 months: 1 month of computational power to a research organization.

One thing I've always wanted to create is a competition website like Kaggle but with real-world results. Let's say I create this and go to Mount Sinai hospital in NYC and tell them that I want to run a contest. The contest is open to anybody who thinks they can increase Mount Sinai's margins by X percent within a particular area of interest. The agreement with Mount Sinai is that they have to give us all of the available data and we will open-source it. The challenge for open-source folks is to review the data, and if you can find a way to increase margins by X percent while still upholding the same level of quality, you win the prize. Let's say Mount Sinai gives us access to the entire spreadsheet of products they order, from hospital gowns to radiation machines; or let's say we fund a $100,000 bounty for anybody or any team who can reduce their emergency room wait times by 50%.

You could solve the problem in any way possible, improving hardware/software, or simply finding redundancy in logistics.

Hospitals are probably not the best examples but I hope you see what I'm getting at...

adamzerner 14 hours ago 0 replies      
Skimmable video. Like this - http://worrydream.com/MediaForThinkingTheUnthinkable/

Like any platform, there's the chicken-egg problem. I'm not sure how you'd get enough content, but I'm confident that if you had enough content, it'd be better than YouTube.

Starting off, getting content wouldn't be that hard, though. You could take existing YouTube videos and make them skimmable. And you could convince people to make videos for your site because the quality is so high.

And if you could create a tool that makes producing these videos easy, it'd provide single-user utility, which is important for platforms. See http://platformed.info/

gbrits 18 hours ago 1 reply      
Related to @mden's Idea: Tree of knowledge.

Chrono: chronological inventions and academic breakthroughs of mankind as a dependency graph. This is a lingering idea that has been coming back to me a couple times a year over the last decade or so.

What if there's a kind of semantic wikipedia that is built upon a dependency graph of inventions and academic breakthroughs. What led to the invention of the internet, to nano-tubes, etc? How cool would it be from an education standpoint to be able to jump back in time and see invention upon invention replayed (with backgrounds on how these breakthroughs came to be) up to today.

Check out what led to invention X (the Galaxy S you're reading this on), played back. Or conversely, look up which inventions were built (transitively) upon the discovery of Y. You'd also finally be able to answer definitively who was more important: Tesla or Edison ;)

Socio-economic backgrounds, anecdotes, etc.: what led to invention X, how X was important for Y, and so on. An interactive "Short History of Nearly Everything".

nathanathan 23 hours ago 2 replies      
[X-post from the previous Idea Sunday thread that didn't make it to the front page: https://news.ycombinator.com/item?id=7616132]

Idea: Git-story, a website that generates summary narratives from git commit histories and other github data.

Here's a brain-dump with some ideas for the specifics:

Use foreshadowing: "It all started with one person, X, spending months to gradually build what would one day become Y, a project forked by hundreds and starred by thousands."

When someone makes their first contribution to a project give them a brief introduction, like a shorter version of http://osrc.dfm.io/

Use sentiment analysis on commit messages to say things like "Frustrations mount as...", "the developers rejoice after..."

When people work on multiple concurrent branches, use phrases like: "Meanwhile, X and Y toil away on the new Z feature."

Use the time between commits to chunk them into single sentences/paragraphs. Also, add comments if the project goes dormant, or if there is a spike in development.

Use keywords in commit messages like merge, revert, resolve to generate events in the story.

When bugs are resolved, look for linked issues and use the age of the bug and the number of comments to say things like "X finally fixed the controversial Y bug."
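The keyword-to-event part of the brain-dump is easy to prototype. A minimal sketch, with my own made-up keyword table and phrasings standing in for the real templates:

```python
import re

# Hypothetical mapping from commit-message keywords to narrative beats,
# checked in priority order (revert before merge, etc.).
EVENTS = [
    (r"\brevert\b",        "{author} backtracks, reverting earlier work."),
    (r"\bmerge\b",         "{author} brings the branches back together."),
    (r"\b(fix|resolve)\b", "{author} finally puts a bug to rest."),
]

def narrate(author, message):
    """Turn one commit into one story sentence."""
    msg = message.lower()
    for pattern, template in EVENTS:
        if re.search(pattern, msg):
            return template.format(author=author)
    return f"{author} quietly continues building."

narrate("X", "Merge branch 'feature'")  # → "X brings the branches back together."
narrate("Y", "fix typo in README")      # → "Y finally puts a bug to rest."
```

Sentiment analysis and time-based chunking would layer on top of the same per-commit pipeline.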

karangoeluw 1 day ago 10 replies      
Ok I'll go first.

It's an "Imgur for audio files".

There are times when you record audio and want to share it. What do you do? Uhh... umm... Yup, exactly. There's no reliable, easy-to-use app for sharing audio files (not music).

So, this is a web/mobile app for easily uploading and sharing audio files, and playing them. I don't have a full plan laid out, but I'll work on it for sure.

(If you'd like to be notified when it's done, let me know: http://eepurl.com/SRIPT)

neilsharma 19 hours ago 3 replies      
Problem: Completion rates for online courses are dismal and engagement with other students and faculty is low.

Idea: Weekly online, live discussion sections to accompany self-paced video lectures. Discussion sections have 5-10 students and are facilitated by Teaching Assistants.

How it works:

Students taking a MOOC course sign up each week for a discussion section. There can be multiple discussion sections to accommodate changes in the student's schedule, different time zones, etc. Students pay ~$10/discussion, once a week, for an hour.

Discussion sections can be G+ Hangout style and taught by crowdsourced Teaching Assistants. These TAs can be grad students in universities looking to make extra money. They can assist students with HW problems, go over tough concepts, and talk about material outside of the immediate subject matter.

TAs can rev share per discussion. Example: 30% rev share for a class of eight students paying $10 each --> $24 for the TA for an hour of teaching. This is significantly higher than market rate (~$10-16/hr)
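The payout math from the example above, as a one-liner (function and parameter names are mine):

```python
def ta_payout(students, price_per_student=10.0, rev_share=0.30):
    """TA's cut for one hour-long discussion section."""
    return students * price_per_student * rev_share

# Eight students at $10 each with a 30% share → the $24/hour from the pitch.
payout = ta_payout(8)
```

At the quoted ~$10-16/hr market rate for grad students, even a six-student section clears market.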

siruva07 1 day ago 6 replies      
I'm responsible for purchasing our computer equipment for our startup. Every time I make a purchase on our company (@MakeSpace) credit card, I have to remember to send an email to our accountants (record the purchase as an asset, depreciation for taxes, etc).

More importantly, it's hard to keep track of what equipment was given to each employee. I imagine at a larger company this would be handled by an IT department, but sub-50 people, I'm doing this on a spreadsheet myself. I would love a simple web app to record the machine's serial #, the receipt (as an uploaded PDF), date purchased, employee, etc.

Happily would pay monthly SaaS. Please message me if anyone knows of this type of product. I'd happily be your first customer if you want to build it.
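The fields described above fit in a single table. A minimal sketch using SQLite; every column name here is my guess at what such an app would store, not a spec:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE equipment (
        serial_no     TEXT PRIMARY KEY,
        description   TEXT,
        receipt_pdf   TEXT,   -- path/URL to the uploaded receipt PDF
        purchase_date TEXT,
        assigned_to   TEXT    -- employee currently holding the machine
    )""")
con.execute("INSERT INTO equipment VALUES (?, ?, ?, ?, ?)",
            ("C02XXXX", "MacBook Pro", "receipts/042.pdf", "2014-04-20", "alice"))

# "Who has this machine?" replaces hunting through the spreadsheet:
row = con.execute("SELECT assigned_to FROM equipment WHERE serial_no = ?",
                  ("C02XXXX",)).fetchone()  # → ('alice',)
```

The SaaS part is mostly a CRUD UI over this plus an export the accountants can ingest for depreciation.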

edit: Happy Easter!

PopcornTimer 13 hours ago 1 reply      
Mixergy specifically for bootstrapped solopreneurs/micropreneurs, minus the video interview and a lot of the unnecessary info.

I want something I can read (comprehensive guides) with specific background info on insider knowledge per industry or cool things they did that made the company a success.

Mixergy interviews are often too long, too much noise to sift through, and not enough core information.

Examples of things I'd love to see... Say you are building a physical product: how did you go about figuring out/developing the prototype? If it's hardware, where did you go to get the prototype done (PCB boards, etc.)? How did you figure out manufacturing, and if it's overseas, how did you coordinate (different language, sourcing, shipping, quality control, etc.)?

If you're in the food industry, how did you get the contacts to help you get your product into stores, how did you develop or manufacture the product, etc.? If you're in catering, how did you go about reaching specific clients before you were known, etc.?

The things people want to know are the details that are difficult to find that could be useful. I don't care much for the stories (success or failure doesn't matter as much as the core background on how things got done).

Jemaclus 11 hours ago 2 replies      
Google Glass app that streams subtitles from Netflix, Hulu, or even cable TV. I rely heavily on subtitles to watch TV (I'm very hard-of-hearing), but I've found that my [hearing] friends don't like subtitles very much. It would make my life 10 million times easier if I could watch TV with subtitles or closed captioning without having to subject my friends to that as well.

I don't know jack shit about Google Glass, but this is so high on my list of wants that I'd drop $1500 for a Google Glass just for this app, assuming it worked well and as advertised.

Someone make this happen.

sakai 21 hours ago 3 replies      
A simple service for setting up and running jobs/workers without having to run a server.

Ideally, it would have the following features:

* Pay once, run forever (pay for the job up front and never again -- no recurring billing to worry about)

* Configure once, run forever (use Docker/LXC in the background to allow custom environments and absolve the user of the dependency headaches that can arise when running multiple jobs on a single machine)

* Easy to use

I've been casually working on this as it's a pain point I've experienced numerous times (e.g., running a daily job that should cost ~50 cents per month, which is substantially below any available VM price).

Would anybody use this? Other thoughts?

jreed91 18 hours ago 1 reply      
I just thought of this so I haven't exactly vetted it out yet or checked if anyone else is doing this.

A major pain I've noticed is event planning. If you are doing it solo you must call a ton of different people: a venue for the event, a caterer or someone to supply food, decorations, and finally you have to invite all the people you want to come.

My idea is kind of a cross between Airbnb and Eventbrite. We create a service that works only with local venues and food vendors. The service lets them post their available times and food availability in a central location. Individuals planning an event can come to this website and select what they want and where they want their event.

This service acts as the middleman, easing the pain of event planning in a one-stop website, and also benefits local small businesses by connecting them with people looking for event vendors.

Let me know if anyone sees any problems with this or ways it could be improved.

realrocker 21 hours ago 0 replies      
A mobile app to gamify recruitment. Users can win small amounts of credit by 1) referring friends (from their phonebook) for a job, or 2) reviewing jobs to put them in one of three baskets: not applicable; interested but not actively looking; interested and applying. This credit can be redeemed for non-cash items like gift cards and coupons. Recruiters pay a fee (in-app payment) to post jobs. In return they get: 1) leads to candidates who applied, and 2) a profile summary of people who found the job interesting but did not apply, or found it not applicable. E.g.: 10 applied (list of contact info); 15 found the job interesting but did not apply (of which 70% are web developers, 60% have salaries above 100k, and 80% have more than 5 years of experience). I have been working on a prototype, but I still need to iron out a few wrinkles idea-wise.
frankpinto 13 hours ago 2 replies      
A site where you sign up for coffee / tea / lunch with a stranger in your workplace's area.


- I'm currently in Guatemala City and nobody talks here. A lot of people go to their office, bring their lunch, go home. You hang out with your high school / college buddies some evenings / weekends. I should know who works in the building next to mine.

- Haven't read the book but the concept of "Never Eat Alone" has been running through my head for a bit

- http://www.inc.com/ilan-mochari/6-habits-connectors.html?cid...




vishaldpatel 23 hours ago 2 replies      
Here's one I've had for a little while: Say you're playing a sport with your friends. You're yelling at each other, commands like "pass the ball here". Or you're out playing paintball and trying to coordinate an attack: your sound, and your opponent's sound, are both pretty important.

As far as I know, such use of voice does not exist in any game. The player's voice does not really interact with the environment. So, say someone says "come to me!" through voice chat: you still have to look at the map to see where they actually are.

cabalamat 16 hours ago 2 replies      
Something like Wordpress but including a wiki as well as a group blog. The blog entries would also be viewable based on recentness/score (like on reddit or HN) or by topic (like on forum software).

It would be possible to have a local mirror of the site on one's PC which would automatically sync with the live site; this mirror could also be used to set up other live sites. This would be an anti-censorship measure: if the site went down, someone else could mirror it easily.

refrigerator 23 hours ago 3 replies      
[X-post from the previous Idea Sunday thread that didn't make it to the front page: https://news.ycombinator.com/item?id=7616132]

Idea: a new way to purchase and set up a fish tank. Currently, you have two options:

a) Buy everything separately - tank, filter, heater, plants, fish. You have to find out whether your fish and plants are compatible and your tank is big enough for what you want etc. You have to figure out where to put the heater and filter so that it doesn't look unsightly.

b) Buy a prebuilt tank with the filter and heater etc. pre-installed somewhere not too ugly. You still have to figure out which livestock you can keep, based on tank size, filter type, plants, and whether they can live with the other livestock you want. You also have to live with the prebuilt tank company's design decisions, which you might not like.

The solution: a company that offers minimalistic, sleek tanks with a modular system for adding filters, heaters, skimmers, lighting, etc. that keeps the equipment out of the way and not looking ugly. Also, an online service where customers can select and order the modular tank and equipment that they want, and choose from compatible livestock and plants. Alternatively, they can start with the livestock they want and be recommended the right modular tank and equipment. They can pay for everything all together, and the items would be delivered as they are needed (with marine tanks, for example, you have to let the tank 'cycle' for a few weeks so that the water parameters normalise before you can add livestock).

What do you guys think?

fiatjaf 15 hours ago 0 replies      
A platform for building websites with data you already have (or will create/update) and want to make public.

There is already a specialized niche for this: blogs. Blog platforms just take your data (posts) and display it in a nice time-aware format.

But there are no alternatives for pages built from non-temporal articles, tree-structured data, hierarchical content, lists of things, etc.

kidlogic 10 hours ago 0 replies      
Yelp for Manufacturing - help hardware startups determine which manufacturer fits their needs and remove the question of whether or not they're working with someone qualified
martinaglv 20 hours ago 1 reply      
tl;dr Independent store for bookmarkable HTML5 mobile apps

This is an idea that I had recently, but for which unfortunately I have no time. I hope that some of you folks can make it happen.

The idea is that with the recent release of Chrome for Android's "Add to Home Screen" feature, there is now a way to bookmark websites to the home screen of every mobile os. Mobile sites can add a meta tag to hide the browser chrome and look fully native. Combined with fast mobile processors, this means that we can finally have native-like experience only by using HTML.

It may be difficult to build a business around it, and could make more sense if it is crowd-sourced (the database could be hosted as json on a github repo).

I haven't done much research, but I believe that an independent store which collects these apps, makes them discoverable, and instructs people how to install them would be very useful, and would do a great job of promoting the freedom of the web over closed app stores.

bliker 21 hours ago 5 replies      
I really want a nice wysiwyg markdown editor. Not the two column layout. I want syntax highlighting, but it should not only do colors but also semantics. Headings should be actually bigger and italics slanted.

I tried and failed to make this idea a reality. I got to a partial solution using regular expressions. It's far from functional and reliable, but it's only about 200 lines!

You can check it out here: https://dl.dropboxusercontent.com/u/52646091/syntax/selectio...
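The regex approach described above can be sketched compactly. This toy version (in Python, not the original's JavaScript; all names are mine) maps markdown spans to (text, style) pairs that a WYSIWYG editor could render with real bold/italic/heading styles instead of just colors:

```python
import re

def style_spans(line):
    """Split one markdown line into (text, style) pairs."""
    if line.startswith("#"):
        return [(line.lstrip("# "), "heading")]
    spans, pos = [], 0
    # Bold (**x**) must be tried before italic (*x*) or the stars mismatch.
    for m in re.finditer(r"\*\*(.+?)\*\*|\*(.+?)\*", line):
        if m.start() > pos:
            spans.append((line[pos:m.start()], "plain"))
        spans.append((m.group(1) or m.group(2),
                      "bold" if m.group(1) else "italic"))
        pos = m.end()
    if pos < len(line):
        spans.append((line[pos:], "plain"))
    return spans

style_spans("a *b* c")  # → [('a ', 'plain'), ('b', 'italic'), (' c', 'plain')]
```

The hard part the comment alludes to is exactly what this glosses over: nested and malformed markup, which is where pure regex solutions stop being reliable.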

curlyreggie 5 hours ago 0 replies      
This might be trivial. One of my friends is a serious blogger (she writes around 7-8 blogs on various topics). Why not a simple aggregator to cross-post to whichever blog site you wish?

E.g., you want to blog on Tumblr: create the post here and it gets auto-posted to Tumblr directly.

Of course, the API issues and permissions are another headache to worry about.

hershel 19 hours ago 1 reply      
1. Combining the MiO water flavoring, which gets very good reviews in its category, into the cap of a reusable water bottle, similar to Pillid[2].

2. PCB layout design, in the electronics industry, is somewhat similar to games like Pipes. So why not gamify PCB layout[3]?

3. Recently I read some interesting research about an electronic circuit. It would have been very useful to get access to a full diagram/layout and a module to buy and play with.



Buetol 23 hours ago 4 replies      
An anonymous and representative group discussion and voting system

Practical example: Attending a conf as a woman

- You want to ask questions during the talks, but you are afraid that because you are a woman the answer will be "dumbed down" or just different

- Also, the guy doing the talk would like to answer the best possible question (or a random one)

So, there can be a lot of solutions to this problem, here is mine:

- Every attendee gets an anonymous account on discuss.confname.org

- So everyone can ask questions anonymously and also it's fair because everyone has only one account

Except that I made up this example in 5 minutes. This problem effectively exists in every possible group in the world. People would like to express their opinions inside the group without risking differentiation.

I tried to describe this idea and the implementation ( http://kioto.io ), but it's really hard to explain. So I'm just implementing a prototype right now to better explain this idea.

genericsteele 23 hours ago 5 replies      
Idea: StackOverflow for comebacks

I've always had a hard time coming up with a good comeback in conversations. It would be great to have a site where I could post a situation and have the community suggest and upvote/downvote insults and comebacks. Maybe introduce a real-time element so I could use it in an actual conversation.

mudil 22 hours ago 2 replies      
WP plugin to charge for guest posts.

As a blogger for the last 10 years, I think someone should provide WP blogs with the ability to charge for guest posts. That could be done either as a one-time purchase (for a single guest post) or as a recurring payment (i.e. a monthly charge for an unlimited or predetermined number of guest posts per month). The purchaser makes a payment, opens an account, and is then redirected to WP where they can write a post.

Guest posts are a growing industry for a number of reasons: SEO manipulation, old-fashioned PR, promotional activities, contests with multiple participants (such as writing contests), etc.

rdl 23 hours ago 1 reply      
Free network (data, voice, mobile) monitoring in exchange for personalized and general/statistical data based recommendations (which plan to change, which ISPs in the area are best, etc.)
Lambdanaut 23 hours ago 2 replies      
Let's take 3D printing to the circuit board world.

A consumer machine that can be configured to take as inputs:

1. A set of different electrical parts (perhaps self-contained in a large cartridge, like printer ink)

2. A circuit board schematic file

The machine cuts the board and solders the parts in.

And there you have it! Your own computer factory! (For limited definitions of "computer")

If this idea ever piques anyone's interest, I'd love to lend a hand with it.

LazerBear 21 hours ago 2 replies      
GitHub for mathematicians.

For every theory, define its axioms and valid logical steps. Let anyone build theorems based on these (validated automatically), and allow people to fork others' theorems to create their own.

It's probably possible to get a lot of proofs from projects like Mizar and Metamath to start with, then let the community build on top of it.

Maybe even a crowdsourced bounty program for unproven conjectures, like P=NP. Let people pledge, and automatically pay whoever proves or disproves one.

I think this can really change how mathematical research is done.
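As a toy illustration of what a machine-checkable, forkable theorem on such a site might look like - shown here in Lean (a proof assistant in the same family as the Mizar and Metamath projects mentioned above; the theorem names are made up):

```lean
-- A trivial machine-checked theorem: addition on naturals commutes.
theorem my_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- A "fork" could build on it, with the site re-verifying the proof:
theorem my_add_comm3 (a b c : Nat) : a + (b + c) = (c + b) + a := by
  rw [Nat.add_comm b c, Nat.add_comm]
```

Because the checker validates every submission, forks and merges of proofs could be accepted automatically, GitHub-style, without a human referee.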

joeclark77 17 hours ago 0 replies      
Crowdsourced gardening. You create an account and take pictures of your garden. A cloud service stitches them together into a 3D model. GPS location and the heights of the surrounding house/trees/walls are used to predict sunlight and shade. Historical weather is recorded, and predictions of future weather are built in. Optionally, the user can use cheap sensors or manual test kits to measure soil content, moisture, acidity, etc.

Once the data is put in, the garden becomes part of the user's "profile" on the site. Others can examine it and make suggestions about what to plant, how to amend the soil, etc. User can log the things they do and upload photos (or use automated sensors and webcam to send periodic updates). You can "star" someone's garden to keep track of it and see how well their decisions worked.

Would be a great tool for experienced gardeners with too much time on their hands, and for busy newbies (who are nerds like me) to get free access to distributed knowledge and learn how to grow food in the back yard.

cweiss 11 hours ago 1 reply      
A photo app designed specifically for concerts - the main feature is that the display goes dark while filming/snapping shots. As someone who goes to a lot of shows, I hate the myriad tiny squares I see between myself and the stage, and I've noticed that only a fairly small percentage of folks actually check focus etc.; they just stick their phones in the air and shoot away.

Bonus points for good social media integration - Read my location/4sq to know what show I'm at - Easy posting to 4sq/FB/Twitter/Compuserve.

slater 22 hours ago 3 replies      
A sortable, queryable list of movies. A site that would return a sortable list of results for queries such as "Show me all science-fiction movies made in the last ten years" or "All Arnold Schwarzenegger movies that have won an Oscar" (trick question!).

- Yes, IMDB exists. And has some of the functionality I'd like. But it is a slow site, replete with ads, upsells, 2003-esque "only show 10 results per page!", etc. Yes, I realize they have to make money.

- Yes, there are millions of movies in existence, and thousands added to the pile every year. Not sure how to fix that data issue :(

rokhayakebe 21 hours ago 0 replies      
Endorse Graph: A browser plugin that allows people who are signed in with their Linkedin, Github, Stack Overflow, Medical-Sites, etc... to endorse the content of a web page.
yesimahuman 22 hours ago 0 replies      
Registered agents as a service. Right now, hiring people out of state is a pain, despite how popular remote working is getting. There are a lot of random rules in each state, and you have to have a registered agent in a state in order to hire there. There are a few services that do this, but they do it poorly, and not in the typical startup fashion we've come to expect.
kordless 21 hours ago 0 replies      
A highly distributed framework for building a coop cloud using OpenStack and Bitcoin: https://github.com/StackMonkey. Whitepaper here: https://github.com/StackMonkey/xovio-pool/blob/master/whitep...

I'm working on this full time now, so it's less of an idea than a reality in progress! :)

miguelrochefort 21 hours ago 0 replies      
Semantic product review. Object-oriented product review if you will.

Refer to specific aspects of a product and make semantic statements about them.

Endorse what you agree with instead of repeating it.

Basically, no blank text box waiting to be filled with the reviewer's own unique way of formatting their thoughts.

fiatjaf 15 hours ago 0 replies      
A tool for storing/sharing information inside private communities.

Enough of the forum/blog/posts/email solutions! How can a community of people, oriented around a subject or location, keep organized data about the things it cares about?

notduncansmith 15 hours ago 1 reply      
One idea I saw posted last week that I'm scoping out and considering working on: Dataclips for everyone.

Heroku Dataclips allow you to share the results of a SQL query against your database with a simple URL. It'd be really cool as a standalone service that you could hook up to your non-Heroku DBs (local, QA, production, etc). An API would be sweet as well (imagine having a dataclip with your stack traces during QA).
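A standalone version could be sketched in a few lines: save a read-only query under an opaque token, then serve its result as JSON at a shareable URL. Everything below (the token scheme, the `CLIPS` registry, the `/clips/<token>` URL shape) is an assumption for illustration, not Heroku's actual design.

```python
import hashlib
import json
import sqlite3

# In-memory registry mapping an opaque token to a saved query.
# (A real service would persist this and scope it per connected database.)
CLIPS = {}

def create_clip(sql: str) -> str:
    """Save a read-only query and return a short shareable token."""
    token = hashlib.sha256(sql.encode()).hexdigest()[:12]
    CLIPS[token] = sql
    return token

def run_clip(conn: sqlite3.Connection, token: str) -> str:
    """Execute the saved query and return the rows as JSON."""
    cur = conn.execute(CLIPS[token])
    cols = [c[0] for c in cur.description]
    rows = [dict(zip(cols, r)) for r in cur.fetchall()]
    return json.dumps(rows)

# Demo against a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, signups INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("ada", 3), ("grace", 5)])
token = create_clip("SELECT name, signups FROM users ORDER BY signups DESC")
print(run_clip(conn, token))  # JSON rows, servable at e.g. /clips/<token>
```

The hard parts of the real product would be auth, read-only enforcement, and caching - the query-to-URL core is tiny.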

caseyash2 22 hours ago 0 replies      
As part of my venture into user interface and user experience design (http://www.caseyash.com), I created a few concepts. It would be excellent if I could implement them; however, technical individuals are rare in Tennessee.

Below are the concepts:

Pack - A travel planning tool that integrates weather information to keep you alert about what to pack.

RQRES - A real estate search that uses a collaborative algorithm to quickly find a home. The application pulls 10 homes; the user rates them and is then shown highly relevant results.

Pattern - A more difficult game of Simon. Instead of four tiles, there are nine. The game also features ways to manipulate the game board.

Wonder - Hyper-local network.

ld00d 16 hours ago 1 reply      
A system for homeless charity. I never have cash. What if I could scan a QR code to donate? What if that donation were better distributed through a central agency instead of going directly to the person on the street corner? The guy on the corner gets a bigger piece as an incentive.

A distributed peer to peer encrypted chat system. No dependencies on Google or whoever for hosting. No middle channel holding private keys. Threaded conversations. Synced across devices.

MicroBerto 15 hours ago 0 replies      
This might exist, but Googling for it has been a wasteland.

I follow many social media feeds that are other companies in my industry - some partners, some competitors.

I want a script or app or service to dig through their social media history (pics, posts, etc) and send me what's been most engaging.

That's it. It can probably be done in iMacros; I just haven't found the time... Could be a small SaaS, though.

wmaiouiru 13 hours ago 0 replies      
A cloud processing platform that would automatically edit videos for users based on algorithms. My thesis is that users will take more and more videos, but the tools to edit them (Adobe Premiere, the YouTube editor, etc.) are too much hassle for people. The first market segment would be GoPro and Google Glass users. Thoughts?
dk8996 21 hours ago 0 replies      
Social analytics. The idea is that you can keep track of multiple Facebook fan pages/Twitter accounts and see how they grow over time: how many followers over time, how many links, etc. You may want to see other info, like which countries the likes and follows come from. Something very simple that just keeps track of your analytics and displays a nice chart.
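The core of this is just timestamped snapshots plus a growth calculation. A minimal sketch, assuming follower counts have already been polled from the platforms' APIs (the sample numbers are made up):

```python
from datetime import date

# Timestamped follower snapshots for one tracked account
# (a real service would poll these from the Facebook/Twitter APIs).
snapshots = [
    (date(2014, 4, 1), 1200),
    (date(2014, 4, 8), 1350),
    (date(2014, 4, 15), 1500),
]

def growth_series(snaps):
    """Per-interval absolute and percentage follower growth."""
    out = []
    for (d0, n0), (d1, n1) in zip(snaps, snaps[1:]):
        out.append((d1, n1 - n0, round(100.0 * (n1 - n0) / n0, 1)))
    return out

for day, delta, pct in growth_series(snapshots):
    print(day, delta, f"{pct}%")
```

Feed the resulting series into any charting library and you have the "nice chart" part of the idea.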
JacobJans 20 hours ago 0 replies      
A twist on a simple timer / productivity app.

When you want to focus, instead of pressing start, you ask for time from someone else.

Once you've finished using that time, you get to pass it on to somebody else.

Completion depends on approval from the previous owner of the time.

As the time gets used, it gets passed from person to person, creating a "time chain." Participants get to see the history of the time chain. Established users can create new time chains and watch them grow.
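The passing mechanics above can be sketched with a tiny data structure. The class and method names here are hypothetical, just to make the request/approve handoff concrete:

```python
# Minimal sketch of a "time chain": a block of focus time that is
# requested, used, approved by the previous owner, and passed on.
class TimeChain:
    def __init__(self, creator):
        self.owner = creator
        self.pending = None
        self.history = [creator]  # visible to all participants

    def request(self, requester):
        """Requester asks the current owner for the time."""
        self.pending = requester

    def approve(self):
        """Current owner approves the completed session; ownership passes."""
        self.owner = self.pending
        self.history.append(self.pending)
        self.pending = None

chain = TimeChain("alice")
chain.request("bob")
chain.approve()          # alice approves bob's completed session
chain.request("carol")
chain.approve()          # bob approves carol's
print(chain.history)     # ['alice', 'bob', 'carol']
```

The `history` list is the "time chain" participants would watch grow.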

iamsalman 20 hours ago 0 replies      
Searchable photo albums OR Google Image search for personal photos.

Basically a way to organize photos into searchable tags that are context-aware and auto-generated. This way, I won't have to browse through tons of photos to find the ones shot at a concert, for example. What if I simply search for "concert" and all my photos shot at a concert are fetched? The key here is to auto-tag the photos as well as (or nearly as well as) Google does with their Image search.

nashadelic 20 hours ago 0 replies      
A knowledge assimilator: it crawls the web and other knowledge sources and summarizes facts about any topic. The system is thus able to auto-generate a Wikipedia-like page for the topic.

If employed in a corporate/enterprise environment, it reads all the documentation, and then someone can ask it questions like "What does the SDP 5.1 do?", "What is the capacity of an SDP 5.1?", "Can I connect an SDP to an SCP?", etc.

sarvagyavaish 17 hours ago 1 reply      
Travel Assistant: While travelling, I like to keep family and friends up to date about my travel plans - itinerary changes, delayed flights, boarded flight, landed flight, etc. Instead of pushing updates by texting 3-4 different people, I want to be able to provide an update in one place, say, on the Travel Assistant app, and my family can receive the appropriate updates.
zealon 16 hours ago 0 replies      
Ok, here is mine: Prototyper.
- Create a webapp that allows the users to choose their phone model.
- Based on that, generate and allow downloading of a custom app (Android, iOS) for that model.
- The user should be able to customize the software modules for that app and interconnect them, IFTTT-like.

HTH ;-)

coinspotting 22 hours ago 0 replies      
You can look at Firespotting - it's a Hacker News for ideas.
miguelrochefort 21 hours ago 2 replies      
Semantic version of Twitter.

Every tweet is RDF.
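To make that concrete, here is a rough sketch of one tweet serialized as RDF in Turtle syntax. The SIOC/Dublin Core terms and the example URIs are illustrative choices, not an official tweet schema:

```python
from string import Template

# Hand-rolled Turtle serialization of one hypothetical tweet.
TWEET = Template("""\
@prefix sioc: <http://rdfs.org/sioc/ns#> .
@prefix dcterms: <http://purl.org/dc/terms/> .

<$uri> a sioc:Post ;
    sioc:has_creator <$author> ;
    dcterms:created "$created" ;
    sioc:content "$text" .
""")

def tweet_as_turtle(uri, author, created, text):
    """Render one tweet as a Turtle document."""
    return TWEET.substitute(uri=uri, author=author,
                            created=created, text=text)

doc = tweet_as_turtle("http://example.org/tweet/1",
                      "http://example.org/user/alice",
                      "2014-04-20T12:00:00Z",
                      "Semantic tweets could be queried with SPARQL.")
print(doc)
```

Once tweets are triples like this, following, searching, and analytics all become graph queries (e.g. SPARQL) instead of bespoke API calls.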

DanBC 23 hours ago 0 replies      
It would perhaps be better if only one person posted these threads.
vecio 22 hours ago 0 replies      
A specific place to share and watch how other developers or designers work on their projects, by live screen sharing. It should be a good place to learn from real projects development.

I came across this idea after YC said no to my Android screen live sharing service https://shou.tv, a live video streaming service for Android gamers.

hershel 12 hours ago 0 replies      
Tools to reduce distraction in doing research on the web.
nhebb 21 hours ago 0 replies      
A newsletter advertising network.
dmacedo 19 hours ago 1 reply      
Github for databases. I've been brainstorming occasionally about this idea... :)
philip1209 21 hours ago 0 replies      
An app that shares the password of local wifi - e.g. for the coffee shop you are sitting in.

Perhaps neither practical nor legal, but something that would be useful when the barista is feeling snarky and you don't want to return a second time to ask for the wifi password.

ape4 23 hours ago 2 replies      
Zipcar/Uber: a service that delivers a car for you to drive, to wherever you are, when you want it. Maybe you call an hour before, or use an app. And/or the opposite: you've driven somewhere and don't want the car anymore - e.g. after drinking.
mukeshsoni 22 hours ago 1 reply      
A site which keeps track of which of these 'Idea Sunday' ideas actually got implemented and how they are doing 1, 2, 5, or 10 years later. It would give empirical data on just how powerful ideas are.
shanacarp 20 hours ago 1 reply      
A peer to peer mortgage marketplace.

Basically, qualified investors own pieces of mortgages in their communities.

I've seen variations of this for student loans - why not mortgages?

hackaflocka 20 hours ago 0 replies      
A Chrome plugin for Hacker News and Reddit that does only one thing: collapse all comments to top level comments.
bnchrch 21 hours ago 2 replies      
Hey, I created a little meteor app today just for this. http://thoughts.meteor.com/
brightsunday 13 hours ago 0 replies      
Bufferbox Duplex - I sometimes find myself wanting to transfer a physical item (books, keys to bikes, etc.) to someone. Sometimes it's hard to coordinate this, and I often wish I could just put it into a box from which only they could come and pick it up.

If people are familiar with Bufferbox (YC S12, acquired by Google), this is essentially the duplex version. People can both deposit and retrieve packages/items.

One can think of charging people for the amount of time the package remains inside the system, or based on the size of the box, etc.

Pros - if there is a secure payment system on top of this, one can essentially use this to implement a small marketplace.

Cons - Could be used for exchanging illegal goods as well.

hackerpolicy 21 hours ago 2 replies      
Waze but for supermarket prices.
ddorian43 23 hours ago 2 replies      
An all-in-one HTML5 + Flash video player with reasonable pricing and no revenue share for VAST/Google ads.
X4 20 hours ago 0 replies      
OK, you got me in.

I've got a software architecture assignment which allows me to work on any kind of large-scale project (regardless of the number of languages and complexity) for 3 months. The end result will be planned, discussed, and evaluated scientifically. We're two experienced devs. Suggestions for ideas are welcome!

hiis 17 hours ago 2 replies      
1) daily email that highlights the top 10 posts on HN (by points or by most comments)

2) simple way to share sensitive financial details with others (e.g., credit card payments, pay stubs, paypal history, bank transactions)

       cached 21 April 2014 15:11:01 GMT