hacker news with inline top comments (18 Jul 2015, Best)
The Icy Mountains of Pluto nasa.gov
586 points by Thorondor  2 days ago   135 comments top 24
cossatot 2 days ago 2 replies      
Interestingly, they don't look like mountains formed by collisional or extensional tectonic processes to my eye (a geologist who studies mountain formation). They look sort of like diapirs (i.e. lower density material pushing up through a higher-density medium; frozen methane and nitrogen are both less dense than water ice[1]) or vaguely like eroded volcanoes, but without obvious vents. Very different erosional processes could explain some of it, but it certainly doesn't look like typical plate-tectonic or ice-sheet topography. Super cool!

[1]: https://extras.springer.com/2006/978-1-4020-4351-2/Jenam/Ses...

chasing 2 days ago 2 replies      
That's amazing.

The photo of Charon might be the most straight-up gorgeous of the lot so far:


At least to my eye. It's just such a classic shot of a world.

Anyway: USA! USA! (I kid!) Go humanity!

napoleoncomplex 2 days ago 2 replies      
The most exciting part is the following:

"The close-up image was taken about 1.5 hours before New Horizons' closest approach to Pluto, when the craft was 47,800 miles (77,000 kilometers) from the surface of the planet. The image easily resolves structures smaller than a mile across."

At its closest, the probe was 7,800 miles away, so we're going to get images way clearer than this in the following days, and even this is amazing to look at.
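If resolution scales roughly linearly with distance (a simplification that ignores the optics), a quick back-of-the-envelope sketch of what the closest-approach images might resolve:

```python
# Assumption: resolvable feature size scales linearly with distance.
far_distance_miles = 47_800    # distance when this image was taken
close_distance_miles = 7_800   # closest-approach distance
far_resolution_miles = 1.0     # "easily resolves structures smaller than a mile"

close_resolution_miles = far_resolution_miles * close_distance_miles / far_distance_miles
print(round(close_resolution_miles, 3))  # ~0.163 miles, on the order of 900 feet
```

So, under that assumption, features roughly six times smaller should be visible in the closest shots.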

mholt 2 days ago 1 reply      
ommunist 2 days ago 3 replies      
What is even more interesting, the artist Don Dixon more or less accurately predicted what Pluto looks like back in 1979.
etrautmann 2 days ago 4 replies      
The estimate that this area is very new based on the fact that there are no meteor craters assumes that we know the distribution of meteors and space debris in the outer solar system. Does anyone know if we do in fact know this?

It seems plausible that there would be far fewer things at that distance, so the frequency of collisions would be lower.

r721 2 days ago 0 replies      
discardorama 2 days ago 3 replies      
I don't use "mind blown" too often, but I will in this case. Here I am, sitting on my ass munching on a snack, admiring the mountains of a planet 3 Billion miles away. This is so amazing.
dfar1 2 days ago 1 reply      
I am always amazed at how far we've come. I wish I could live 100 years more.
jessaustin 2 days ago 3 replies      
Next century's hottest snowboarding destination...
DIVx0 2 days ago 0 replies      
I did not really know what to expect but these photos are amazing and they're not even from the nearest approach!

Who knew the surface would be so crater free?

zw123456 2 days ago 0 replies      
I have seen a number of articles about the lack of craters and the assumption that this means recent geological activity. But that would assume a cratering rate similar to that of the inner planets like Mars and our Moon. I am wondering: is it possible that, due to its orbit, distance and smaller size, the rate is just a lot slower?

Whatever the answer, it is just a mind-boggling accomplishment, and the photos really are incredible.

andrewstuart2 2 days ago 0 replies      
Also a cool shot of Hydra, for the first time showing its shape.


erobbins 2 days ago 0 replies      
It's very surprising to me how much (apparent) recent geologic activity there is on both Pluto and Charon. I had expected them to be much more Mercury-like.

Pluto was my favorite planet growing up. I never dreamed I'd get to see it up close.

angst_ridden 2 days ago 0 replies      
I can't help it. I tear up and I get all emotional every time I see NASA pictures of a new planetary surface.

To think a bunch of apes can look at and appreciate mountains 6 billion kilometers away... it's awe inspiring!

jmadsen 1 day ago 0 replies      
I can understand why people like my wife look at these pictures and say, "Meh. That's nice."

The enormity of the distances and what humanity has done is, quite honestly, beyond what our minds are actually capable of fully comprehending.

I'll re-post this for your weekend enjoyment:


ape4 2 days ago 0 replies      
Nobody has seen those mountains before!
Schiphol 1 day ago 1 reply      
Having to come up with scientific lessons on the spot ("this may cause us to rethink...") for the sake of press releases must be pretty stressful. I'm also not sure that such more-or-less impromptu analyses foster the right conception of how science works.
ohitsdom 2 days ago 0 replies      
Amazing. Crater-less mountains, and we have no idea how they were formed (or are forming!). Lots of exciting science ahead!
mrfusion 2 days ago 1 reply      
How do they measure altitude of the mountains without a sea level?
yati 1 day ago 0 replies      
This made me wonder, how do they measure the heights of the mountains? Using data from multiple view angles?
lifeisstillgood 2 days ago 0 replies      

This is a time-lapse series of images of Pluto over the years - from a couple of pixels to today's mountain ranges. In about three seconds you get a full, visual, easy to read justification and explanation for the whole space budget and science funding - it is awesome.

This is what we want to see - go NASA

elorant 2 days ago 5 replies      
Why are most of the pics black and white?
dang 2 days ago 2 replies      
Url changed from http://www.bbc.com/news/science-environment-33543383, which is a fine article that adds some info, but perhaps not enough to override the preference for original sources.
Two.js jonobr1.github.io
597 points by bikeshack  1 day ago   84 comments top 20
Argentum01 1 day ago 2 replies      
Played around with this awhile back and I really like it. Great for quick art projects and the like: http://westleyargentum.github.io/monsters/
rjusher 1 day ago 5 replies      
What I am struggling with is the use case of a library like this.

Is this oriented towards gaming? Is it a substitute for something like d3.js? Does it have a simpler API for the developer than other libraries? Or is it a library to showcase the cool stuff that can be done with new technologies?

I think the main difference and the selling point is that it is "renderer agnostic" but I don't understand the benefits of that.

tlrobinson 1 day ago 1 reply      
Interestingly WebGL is way slower than SVG and Canvas in this case (Chrome, rMBP): https://jonobr1.github.io/two.js/examples/particle-sandbox.h...
stared 16 hours ago 1 reply      
As someone using D3.js a lot (even for a puzzle game) - what is the benefit of using Two.js? Jumping between SVG and Canvas? (But then, is it worth the price of reducing possibilities to circles and squares?)
smrtinsert 1 day ago 1 reply      
I don't see any way of working with textures at this point; am I wrong? Part of what I find interesting about working with Paper.js is the ability to render to a texture, and keep piling on stuff if necessary. I haven't tried the performance that would bring, but from what I understand it could be great.
dexwiz 1 day ago 1 reply      
I notice that this can render ThreeJS objects. Any reason to construct an object with ThreeJS and render it with Two?
victorhooi 1 day ago 0 replies      
Basic question - but when would you use this over d3.js?
chejazi 9 hours ago 0 replies      
Considering turning my text-logo into one of these!
arocks 23 hours ago 1 reply      
Is it possible to use this for creating videos with motion graphics? Even the possibility of a frame by frame render would be awesome.
jarek-foksa 1 day ago 1 reply      
Why is Vector.isZero() checking whether a vector's length is smaller than 0.0001? Why this arbitrary number and not, e.g., 0.00000001?
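One likely reason (speculation; this isn't taken from the Two.js source): after floating-point arithmetic, a vector that should be zero rarely compares exactly equal to zero, and for on-screen graphics anything under ~1e-4 of a unit is visually zero anyway. A Python illustration of the general idea:

```python
# A difference that is mathematically zero but not zero in floats:
x = 0.1 + 0.2 - 0.3

print(x == 0.0)          # False: x is ~5.55e-17, not exactly zero
print(abs(x) < 0.0001)   # True: an epsilon test treats it as zero
```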
mintwurm 1 day ago 1 reply      
Is that project still alive? The last two commits are from March this year, and before that there's nothing all the way back to July '14.
kodablah 1 day ago 0 replies      
An alternative library that is Canvas only but works well with input is ZRender[0]. Sadly the docs have not been translated to English (lingua franca of open source libs or software in general?) and I think that affects adoption.

0 - https://ecomfe.github.io/zrender/

drumttocs8 1 day ago 0 replies      
How would this compare to Famo.us?
amelius 1 day ago 1 reply      
I'm looking for a library that can create a calligraphic shape (path) from a stroke path.
vegabook 1 day ago 2 replies      
Awesome...but. No text? That suddenly wipes out my use case (data viz). I was super encouraged reading this post, multiple renderers including canvas / webgl (so I assume speed - unlike D3), smart, clean, indeed creative looking logo/site...suggests the author has taste...but no text kills this for me. I see no obvious use case for animated 2d that does not do textures (personally happy to do without) but also does not do text.
dreamfactory2 1 day ago 0 replies      
what are the major differences between this and paper.js?
humbleMouse 1 day ago 0 replies      
That's pretty sweet.
gondo 1 day ago 3 replies      
what is meant by "modern web browsers"?
gprasanth 1 day ago 3 replies      
How is this different from fabricjs or snapsvg?
billkozby083 1 day ago 0 replies      
I'm testing something...thanks
Potato paradox wikipedia.org
544 points by Panoramix  2 days ago   132 comments top 32
ot 1 day ago 6 replies      
This is not that uncommon when optimizing code.

Your program is slow so you profile it, and find out that f() takes 99% of the time. So you work a lot to optimize f(), and re-profiling shows that now f() takes 98% of the time.

Doesn't seem that impressive after all the work you've put into optimizing f(), but your program is actually twice as fast :)
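The arithmetic can be checked with made-up but representative timings:

```python
# Hypothetical profile: 100s total runtime, f() takes 99s (99%).
total_before = 100.0
f_before = 99.0

# Suppose the optimization halves f()'s cost.
f_after = f_before / 2
total_after = (total_before - f_before) + f_after

print(total_after)            # 50.5 seconds: the program is ~2x faster...
print(f_after / total_after)  # ...yet f() still accounts for ~98% of runtime
```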

eck 1 day ago 5 replies      
A much more important example of this than "martian potatoes" is uranium enrichment.

Natural uranium is ~1% U235; bombs need 90+% U235. So when you've enriched it from 1% to 2% it doesn't seem like you've made a lot of progress towards 90.

If instead of enriching U235 you think of it as eliminating U238, though, then you've done half of the work.
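A quick numeric check of that framing (using the comment's rounded figures; real-world enrichment effort is measured in separative work units, so this is only the eliminate-U238 intuition):

```python
u235 = 1.0  # hold the U235 amount fixed at 1 unit

def u238_for(purity):
    """U238 present when U235 makes up `purity` of the mix."""
    return u235 / purity - u235

start = u238_for(0.01)    # 99 units of U238 alongside 1 of U235
at_2pct = u238_for(0.02)  # 49 units
weapons = u238_for(0.90)  # ~0.11 units

fraction_done = (start - at_2pct) / (start - weapons)
print(round(fraction_done, 2))  # ~0.51: about half the U238 removal is done
```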

kzhahou 1 day ago 3 replies      
Another angle on this problem: How much water must you add to the potatoes to make them 100% water?

Of course, you can add all the water in the universe and they'll still not be 100% water. The water-percent increment just gets smaller and smaller, the more water you add.

This "potato paradox" illustrates the same effect, but in the other direction, where a small relative decrease yields a large absolute decrease.

ThrustVectoring 1 day ago 2 replies      
The solution is much more intuitive if you use odds ratios instead of percentage probabilities. You go from a 99:1 ratio to a 98:2 (or 49:1) ratio.

In other words, it's another way of phrasing the fact that it takes twice as much evidence to be 99% sure as to be 98% sure. Or that it's twice as hard to have 99% uptime as 98%.
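In odds form the potato answer falls out with no algebra (a sketch):

```python
solids = 1.0                 # 100 lbs at 99% water -> 1 lb of solids, fixed
water_before = 99 * solids   # 99% water means odds of 99:1 water to solids
water_after = 49 * solids    # 98% water means odds of 98:2, i.e. 49:1

print(solids + water_before)  # 100.0 lbs before
print(solids + water_after)   # 50.0 lbs after: half the weight is gone
```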

mgalka 1 day ago 0 replies      
Cool concept. Took me a minute to think it through before I could make sense of it.

I think it is the fact that they used potatoes that makes it counterintuitive. Had it instead been a glass of water that had 1% of dissolved salt in it, it would have been very straightforward.

JoshTriplett 1 day ago 1 reply      
The thing the article doesn't point out is why it seems unintuitive.

If you phrased the question as "You have N pounds of potatoes", or with a specific number other than 100, it would come across as less unintuitive. As you read, you see "100 lbs", and "99%", so percents and potato components are both out of 100. So then you see 98%, which is 98/100...

amolgupta 1 day ago 2 replies      
Q: 100 people are seated in a room. 99% of them are engineers and 1% managers. How many engineers should leave the room to make it 98% engineers and 2% managers?

A: 50
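The engineers/managers version is easy to verify directly:

```python
engineers, managers = 99, 1   # 100 people: 99% engineers, 1% managers

engineers -= 50               # 50 engineers leave; the manager stays
total = engineers + managers  # 50 people remain

print(engineers / total)  # 0.98: engineers are now 98% of the room
print(managers / total)   # 0.02: the lone manager is now 2%
```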
kwhitefoot 13 hours ago 0 replies      
This is only a paradox because the framing of the question misleads people, partly by mentioning potatoes, into thinking that they are discussing something that one might actually do in the way it is described. In reality, if you did this with real physical objects, the part where it says "You let them dehydrate until they're 98 percent water" hides a process in which one of the operations would involve picking up the potatoes. At this point you would notice that they were much lighter than before.

So as stated it is a trick question, the kind of parlour game found in old books of puzzles.

Finally, it seems to be a common failure of education to allow people to go through their 'mathematical' training and leave them with the impression that 'percent' is some kind of dimension, when in fact it is shorthand for a ratio: y is x percent of z. If z is not specified, then you don't know what y is, regardless of how much effort you put into discussing x.

So I suspect that in real life there are not many occasions when the paradox appears surprising.

DannoHung 1 day ago 0 replies      
I think it's interesting that just the percentage stated makes it hard to comprehend intuitively.

That is, restate the question with a different end water percentage and the answer is immediately obvious:

> You have 100 lbs of Martian potatoes, which are 99 percent water by weight. You let them dehydrate until they're 50 percent water. How much do they weigh now?

And, of course, it's pretty easy to get "2 pounds", but your brain is pretty fixed on the numbers all clustered together in the other example.

FryHigh 1 day ago 1 reply      
Here is another variation.

A fresh lake gets infested with algae and the amount of algae doubles every day. The algae covers the whole lake in 10 days. How many days did it take the algae to cover half the lake?
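(The answer is day 9: since coverage doubles daily, the lake is half covered one day before it is full.)

```python
# Coverage doubles daily and the lake is full on day 10,
# so coverage on day d is 2 ** (d - 10) of the lake.
coverage = lambda day: 2 ** (day - 10)

print(coverage(10))  # 1: fully covered on day 10
print(coverage(9))   # 0.5: half covered on day 9
```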

blahblah3 11 hours ago 0 replies      
Let p be the percentage of water and m the total mass.

Then m(p) = 1/(1 - p), taking the non-water mass to be 1. The "unintuitive" aspect is that we want to think of this function as linear when it isn't.

By Taylor's theorem, how bad a linear approximation to this function will be depends on its higher-order derivatives. We can get an idea of this by looking at its second derivative:

m''(p) = 2/(1-p)^3. If you plot this graph, you'll see that it really starts to blow up past 0.8, so the nonlinearities start dominating.
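Tabulating m(p) = 1/(1 - p) (with the non-water mass normalized to 1 lb) shows the blow-up numerically:

```python
def mass(p):
    """Total mass when the water fraction is p and the solids weigh 1 lb."""
    return 1.0 / (1.0 - p)

for p in (0.50, 0.90, 0.98, 0.99, 0.999):
    print(f"water fraction {p:.3f} -> total mass {mass(p):7.1f} lbs")
# Near p = 1, small changes in p swing the mass wildly:
# dropping from 99% to 98% water halves the total mass.
```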

guelo 1 day ago 2 replies      
I don't think a simple algebra problem should be called a paradox.
dools 1 day ago 0 replies      
This strikes me as similar to the non-intuitiveness of compounding interest (and related brain teasers like "bacteria doubling population every day" and "pieces of gold doubling on a chess board").
antimora 1 day ago 1 reply      
Another intuitive way of thinking about it is in terms of the proportion between water and potato matter. The weight of the potato matter remains constant; only the amount of water changes, and in our case it goes down. To double the matter's proportion relative to the water, you need to (roughly) halve the water.
thetruthseeker1 1 day ago 0 replies      
A better, or more shocking, way to understand how the numbers interplay is to reframe it like this: imagine a mystical vegetable that is 99.9% water and weighs 100 lbs. If I dehydrated it until it has 99% water, what would it weigh? Yes, it's 10 lbs.
andrewguenther 1 day ago 0 replies      
This equation models the amount of weight lost for various percentages of change:

0.99 * 100 - (0.99-x)(100 - y) = y

This assumes that you are starting with 100 pounds of potatoes at 99% water weight. Here's a WolframAlpha link: http://www.wolframalpha.com/input/?i=0.99+*+100+-+%280.99-x%...

andrey-g 15 hours ago 0 replies      
Took me an embarrassingly long time to figure this out. The key mistake I'd latched onto was thinking they'd lost 99 - 98 = 1% of their water content, which is false.
vermooten 1 day ago 0 replies      
This is bs and you're all witches.
soup10 1 day ago 2 replies      
This is only confusing because of the potatoes.

If you said you had a pool that was 99% water, that changed to 98% water, the massive weight drop would be much less surprising.

chris_overseas 1 day ago 0 replies      
This is related to the base rate fallacy, the maths of which is frequently misunderstood and can lead to very real consequences:


jsingleton 1 day ago 0 replies      
Very good. I also like the birthday paradox. https://en.wikipedia.org/wiki/Birthday_problem
powera 1 day ago 1 reply      
This isn't a paradox at all. It's a slightly non-intuitive result.
paulkon 1 day ago 0 replies      
Makes mathematical sense, yet not quite common sense.
golergka 1 day ago 0 replies      
But how is that unintuitive?
taigeair 1 day ago 0 replies      
Any more of these interesting puzzles?
spacehome 1 day ago 0 replies      
I'd call it: "The Potato Algebra Problem"
yummybear 1 day ago 1 reply      
Looks like I found my new diet.
Lejendary 1 day ago 0 replies      
pervycreeper 1 day ago 4 replies      
Why is this a "paradox"? Not sure what this applies to either. It's not even counterintuitive.
iopq 1 day ago 0 replies      
That's not interesting at all. How do I downvote submissions?
How a car works (2012) howacarworks.com
474 points by gshrikant  19 hours ago   112 comments top 20
AlexMuir 16 hours ago 18 replies      
This is my site!

This was a pleasant surprise to find my own site on HN this morning! I wondered why it was getting a few more FB likes than usual today.

I'm happy to answer any questions.

I finished this redesign last week so any feedback is welcome.

The main task was to recreate labels and annotations on the illustrations in SVG format, and to reformat the articles in a way that flows nicely and is responsive, but without needing complex markup in the articles. I'll write about the process if there's interest.

I've previously written a little about this project:



Current traffic is 200k uniques a month and it's taken about two years of steady growth to reach that point.

giancarlostoro 15 hours ago 5 replies      
I always feel so uncomfortable about my car considering how old it is, and I know nothing about it or how it works. Everyone around me just cracks open their hoods and fixes their car, swaps tires, etc. I can't just move stuff around without actually understanding what anything is. I honestly have been wanting to learn more about cars. It also helps to keep up with mechanics when they tell me things I don't fully understand; I don't imagine they want to sit around all day answering my questions, as they probably have other people to work for. I only vaguely understand the things that have gone wrong with my car and the symptoms.

Thanks for such a beautiful website, and to everyone else sharing some interesting links. Being computer savvy is not especially helpful when it comes to the mysteries of cars.

Btw I would love for there to be an Android app if possible. :)

fauria 18 hours ago 1 reply      
I highly recommend "Engineering Explained" channel on YouTube: https://www.youtube.com/user/EngineeringExplained
jsingleton 17 hours ago 1 reply      
De Lorean Owners Handbook: http://www.howacarworks.com/manuals/doc/d3218-owner-s-manual

Nice. It must have taken guts to build a car in Belfast during the height of the troubles. Not that it worked out that well in the end, but it is an iconic car.

aunty_helen 13 hours ago 0 replies      
This is great. As for the criticism that a lot of the content is old, most of the core concepts can be explained better with older tech as it's simpler. This allows you to focus on what is happening and leave the extra details about what technology a part or process has for a case by case basis. https://www.youtube.com/watch?v=yYAw79386WI is a great example of this.

That said, an article on direct injection, common-rail diesels, efficient turbocharging, variable valve timing, or MAF vs. MAP EFI would be worth considering.

Lorento 18 hours ago 6 replies      
Seems to be very dated. Plenty on adjusting carburetors, in fact adjusting all sorts of no-longer-needing-adjusting parts.
Kluny 8 hours ago 0 replies      
Great site! I'm going to add it to my car fixing workflow. Ie, thing breaks -> read relevant article so I understand how the system works -> find a good tutorial -> source parts -> fix.

If you are looking for article suggestions:

- Article about different fluids and their properties. I'm betting that you've covered oil weights already, but how about coolants? Regular coolant has a lower freezing temp than water, but does it have a higher boiling point as well? Exactly how poisonous is it and what things will it damage when spilled? Same for trans fluid.

- Possibly out of scope, but an article about how to source parts for older cars. This has always been a tough problem for me! I go with something like: look up parts schematic from manufacturer's website -> search for part by part number -> try to find the cheapest generic part (though sometimes it's worth it to pay for quality) -> find that generic parts don't exist -> go on a merry hunt for used parts on ebay and at scrapyards.

Syrup-tan 3 hours ago 0 replies      
There are also some great animations on how a 4-stroke engine works from animagraffs.com;


abakker 9 hours ago 0 replies      
A few pieces of feedback. In the suspension section, it would be worth explaining a few things about truck suspension too, like solid axles and front and rear suspension with both coil and leaf springs. Also, dampers are incredibly important in large trucks/SUVs because if they fail, body roll has a much higher likelihood of causing you to lose control. Finally, a "bounce test" really won't work with many SUVs or trucks where there is simply no way to compress the suspension to any meaningful degree with your hands.

Personally, with trucks, I like the "speed bump test": you drive over a speed bump, and if the truck/SUV continues to oscillate after returning to the correct height, your dampers probably need work. Additionally, since most trucks/SUVs use dampers with oil volumes similar to cars', the life of their dampers is frequently lower than in cars. The ones in my TRD Tundra lasted about 40K miles.

nogridbag 12 hours ago 1 reply      
Neat site. My father was a mechanic and thus I should know all of this stuff, but I wasted away my life behind the keyboard instead :D

I often find videos do a much better job at explaining some concepts. For example, the section on differentials:


After reading this I feel none the wiser. Whereas after watching this ancient Chevrolet training video from the 1930's I feel like I completely understand how diffs work:


matthewrhoden1 13 hours ago 2 replies      
I didn't see the article for when it's 2AM and you can't get that one last bolt off.
ZeWaren 10 hours ago 0 replies      
I love the fact that you can buy the pdf using bitcoins.
csbowe 13 hours ago 0 replies      
The section on how cars are designed is entertainingly out-dated.
chrismartin 12 hours ago 0 replies      
Very nice illustrations, but you are missing important systems that are found on many/all cars from the past 20 years, like front wheel drive, CV joints, closed-loop emissions control, OBD-II and CAN bus, distributorless (electronic) ignition, electric power steering, CVTs and dual-clutch automatic transmissions, and TPMS.

Diagnosing a check engine light by reading OBD-II codes, for example, is something that every owner of a car produced after 1996 will eventually need to deal with.

You should rebuild your content around systems that are found on the majority of cars on the road today.

jshelly 13 hours ago 1 reply      
I was disappointed to see that there is no guide on replacing brake pads for disc brakes. It seems like everyone in the world knows how to do this but me.
kiddico 11 hours ago 0 replies      
I've been looking for a site like this for so long! <3
noipv4 16 hours ago 1 reply      
Nicely done. Man you should really add a section on the Tesla Model S and possibly the Roadster.
dang 19 hours ago 0 replies      
Previously discussed at https://news.ycombinator.com/item?id=4974055.

Edit: after a year or so, reposts are fine. This is just to point out to readers that the site had appeared before; indeed, it was an old Show HN.

donkeyd 18 hours ago 0 replies      
I heard, that in soviet Russia, the car drives you.
Dear Google Mail Team plus.google.com
425 points by ingve  4 hours ago   154 comments top 46
dasil003 4 hours ago 14 replies      
I could see 20% false positives on spam for Linus equating to a 0.1% false-positive rate across the board, since I suspect the people emailing Linus are 200 times more likely to be running their own mail server than the general public.
billconan 4 hours ago 4 replies      
Very funny: right after I saw this, I decided to check my spam folder just to see if anything important had been filtered out, and I saw an email sent by Google marked as spam:

This is the email Google sends when you log in to a Google account from a new machine. They tag their own email as spam ...

Hi xxx,

Your Google Account xxx@gmail.com was just used to sign in from Chrome on Windows.

Don't recognize this activity? Review your recently used devices now.

Why are we sending this? We take security very seriously and we want to keep you in the loop on important actions in your account. We were unable to determine whether you have used this browser or device with your account before. This can happen when you sign in for the first time on a new computer, phone or browser, when you use your browser's incognito or private browsing mode or clear your cookies, or when somebody else is accessing your account.

Best,
The Google Accounts team

secabeen 4 hours ago 5 replies      
I run my own mail server with full SPF, DKIM and SRS, routing the mail through a relay at a reputable VPS provider on high-reputation IPs. Over the last few months, there seems to be this pattern where if I email someone @gmail who I've never mailed before, they don't seem to ever get it. I wonder if this is the issue.
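For reference, the DNS side of a setup like this looks roughly as follows (example.com, the relay host, and the key are placeholders, not anyone's real records; SRS, by contrast, is sender rewriting done by the relaying MTA, not a DNS record):

```
; SPF: only the domain's MX hosts and the VPS relay may send mail for it
example.com.                  IN TXT "v=spf1 mx include:relay.example.net -all"

; DKIM: public key published under a selector; the server signs outgoing mail
mail._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=<base64-public-key>"
```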
c5karl 4 hours ago 3 replies      
I've had a problem with false positives in my spam folder for months. A large percentage of the email newsletters I subscribe to end up in my spam folder every day, and clicking Notspam doesn't help. I can Notspam a certain newsletter every day for a week, and then the next day that same newsletter will end up in my Spam folder once again.

I'm starting to think that Notspam signals have no effect at an individual level. Either that or the button is simply a placebo.

Fortunately, the false positives for personal correspondence from individuals are still extremely rare, at least for me.

chbrown 4 hours ago 2 replies      
I'm kind of surprised that Linus uses Gmail.

It's likely that he'll actually catch a Googler's attention, but for many of us, user feedback is not an option.

@jacquesm's http://jacquesmattheij.com/ham-or-spam-gmail-not-to-be-trust... is another recent instance but again, there's no call to action.

Gmail is great for some people, but I prefer having more control, and I highly recommend https://FastMail.com if Gmail is failing to meet your needs.

lutorm 6 minutes ago 0 replies      
Just about the only thing that google persistently misclassifies as spam for me are logwatch emails. There must be a large number of people who don't know how to turn them off...
qmalxp 38 minutes ago 1 reply      
A few months ago, I wrote a simple Android app and put it on the Play Store. Now every two weeks I receive unsolicited spammy emails about ad campaigns and increasing user awareness. Funny how those get through the filter.
LukaAl 4 hours ago 0 replies      
Happened to me as well. Some of the emails were just "updates" that I like to receive, but if they get lost it's not a big deal. A couple of them were very important ones, though, and to make things worse, they were replies to emails in which I was CC'd. So a colleague of mine sends an email to someone and CCs me; the second person replies, and that mail is marked spam for me but not for the person who wrote the original email. Doesn't it make sense that a reply in a legitimate conversation is by default a legitimate email?
incepted 28 minutes ago 0 replies      
After reading this, I went through my spam folder and it's looking overall quite okay EXCEPT that all the comments on Google+ in response to things I posted these past few weeks are marked as spam. All of them.

Yup: Gmail is marking comments originating from Google+ and written by legitimate users as spam.

someguy1233 2 hours ago 0 replies      
Checked all my spam boxes on my various gmail accounts, seems nothing bad has happened with mine.

At least they're nowhere near as bad as Outlook. I have one of my domains on their free Live Domains (grandfathered plan, can't get it for free anymore, similar to google apps) - and 90% of my emails end up in spam, even if they're from a reputable company with sane mail setups such as Digital Ocean, Github or even Google.

To make it worse, with Outlook you can't turn off the spam filter, and it's known that Microsoft sometimes SILENTLY drops emails for various reasons so they never even make it to your spam box...

Sadly I've yet to find any decent replacement mail service for my domains that's free (or very cheap) and of decent quality.

cybojanek 4 hours ago 3 replies      
How much of this is caused by people marking mailing list emails as SPAM instead of properly unsubscribing?
multinglets 1 hour ago 0 replies      
I'm guilty of this too, but it would really be great if we as a society could develop some impulse against putting all of our eggs in the first shiny basket we see.
tortilla 3 hours ago 0 replies      
Wow, just checked my spam folder and there was an important email marked as spam by Gmail. It was from a known contact I had already corresponded with.
datashovel 1 hour ago 0 replies      
I think even Google should fear the prospects of Linus Torvalds on a mission to "fix email".
danbucholtz 37 minutes ago 0 replies      
Interesting. A few days ago I stopped receiving pull request emails as well. Granted I routinely delete these after I process them - but I was alarmed when I checked my spam and saw them there.
scrollaway 3 hours ago 0 replies      
I am subscribed to the wine-devel, wine-bugs and wine-patches mailing lists (https://www.winehq.org/forums). Having the exact same issue.

It seems to very easily flag discussion about .exes as spam; it's really disappointing. It's been several years and the filters haven't improved, despite me religiously flagging spam/not-spam in those lists.

In the end I just gave up and set up filters to specifically prevent marking incoming emails on those lists as spam. It misses the odd linkedin invitation, but it's not like it was catching it before...

elevensies 4 hours ago 3 replies      
It might not be related, but I've been seeing some spam in my gmail inbox in the past month. It seems that something has upset the balance. For example, this went to my inbox:

From: [...] Baby <[...]baby@gmail.com>



balls187 3 hours ago 0 replies      
"Check your spam folder" is now a default instruction for automated email notifications.

Luckily it's pretty easy to scan the folder for valuable messages.

However, having to do that is clearly not ideal.

Had a wedding RSVP get flagged as spam.

barrkel 3 hours ago 1 reply      
Mailing lists are a massive source of false positives for spam. I've pretty much given up on trying to use gmail to subscribe to them.
noinsight 2 hours ago 0 replies      
He already got a response from the Gmail product manager; must be nice being Linus.

Notice the comments from Sri Somanchi. He's listed as the product manager here: http://gmailblog.blogspot.com/2015/07/the-mail-you-want-not-...

rectangletangle 4 hours ago 0 replies      
Been having similar issues as of late. A few very important emails got classified as spam; not nearly 20%, but still enough to compromise my confidence in the system.
berberous 4 hours ago 0 replies      
I had a similar issue at some point, where for a month or two, quite obvious "not spam" emails were getting caught in the filter. Nobody else on the internet seemed to be having the issue, and then it suddenly one day stopped happening again. I rarely mark emails as spam/not spam, so I don't think it was anything I did.
OscarCunningham 2 hours ago 0 replies      
I had to disable spam detection entirely on gmail because I was losing important emails every week. So now I just delete spam by hand.
lolo_ 4 hours ago 1 reply      
I've had the same experience with the LKML lately (been subscribed for about 8 months, generally spam filter has had v. few false positives), can't remember exact numbers, but big, big chunks have been incorrectly labelled.

There seems to be a fair bit of spam sent to the LKML. I don't know whether there's been more lately, but perhaps the combination of the large volume of email the LKML sends to many people, the decent amount of real spam sent there, and a more aggressive filter setting is the explanation?

mathattack 2 hours ago 0 replies      
I had an email from a Google recruiter go to the Google spam box. I didn't have the courage to send her the screen shot.

The lesson I learned is we still need to review the spam box once every day or two.

intrasight 3 hours ago 0 replies      
All I see in my spam folder is people trying to sell me sex, drugs, credit, and fake Ray-Ban sunglasses. A quick review of 200 spam messages found one false positive from a whitewater rafting company.
holic 4 hours ago 0 replies      
We've been seeing a huge number of messages sent through Google Groups via our Google Apps domain flagged as spam. I'm not sure what has triggered it lately, but it's almost impossible for us to communicate through our Google Groups email addresses anymore.
easyd 4 hours ago 2 replies      
Google recently posted "The mail you want, not the spam you don't":


usaphp 2 hours ago 1 reply      
Wow, 3000 spam threads in just 4 days. How can he find time for actual work with so many emails to read and reply to!
stevewepay 4 hours ago 0 replies      
I get about 600 emails a day because I forward everything from all my email addresses through a single email account, and I only see 9 spam messages in my spam box, all of which are actually spam.
ivank 4 hours ago 0 replies      
Funny, I used to get a ton of false positives in my gmail Spam folder (mailing lists, marketing I subscribed to, forwards from another address) but with the recent changes, I have just 1 false positive of 476 in Spam.
oldgregg 1 hour ago 0 replies      
No accident. Google controls enough of the world's email now that it can just turn the screws and capture the whole market.
Zelphyr 3 hours ago 0 replies      
Sadly, this is the typical level of quality for just about all Google products these days in my experience.
userbinator 2 hours ago 0 replies      
I wouldn't be surprised if somewhere at Google, some team are looking at Gmail stats and congratulating themselves on how much more "spam" they've blocked with their latest algorithm.
kasey_junk 4 hours ago 0 replies      
Assumedly my corpus is much lower than his, but I noticed similar things starting some time "this week".
z3t4 2 hours ago 0 replies      
Spam detection is pretty much solved with blacklists and whitelists. In some countries it's illegal to send spam, so you can safely whitelist senders from those countries.
rn222 4 hours ago 0 replies      
Google Mail Team, please test all future spam algorithm changes against Linus' inbox.
noobie 4 hours ago 0 replies      
I've always had problems with Gmail's spam filter, unfortunately.
guelo 4 hours ago 1 reply      
I fear SMTP is going to go the way of RSS at the hands of these giant corps. Closed protocols inside machine-gun-lined walled gardens are the future. Sorry, old idealistic computer hippies, we've failed you.
paulpauper 2 hours ago 1 reply      
How to get a job at Google: add annoying stuff no one asked for, ignore the requests users actually make
applecore 3 hours ago 1 reply      
If they're mostly patches and pull requests, why doesn't this guy just whitelist those email addresses and domains?
dljsjr 4 hours ago 0 replies      
Runaway DMARC policies perhaps?
MichaelCrawford 2 hours ago 0 replies      
I expect that some people mark a message as spam because they personally dislike the sender or because they disagree with them.

Say, if I told you that your mother wears Army boots.

troymc 4 hours ago 0 replies      
I guess we'll just go back to manually making filters.
ujjwalg 4 hours ago 0 replies      
I wrote an article about spam filtering some time ago. The premise was "Spam filtering should not have any false positives, ever!".

If anyone is interested they can read it here: https://www.linkedin.com/pulse/20141202174235-36852258--spam...

paulpauper 2 hours ago 0 replies      
Gmail is terrible in so many ways - Being randomly locked out of your account, the clunky user-unfriendly interface, the difficulty of marking certain senders as spam, ...
The Web's Cruft Problem telerik.com
356 points by jsnell  1 day ago   153 comments top 34
billyhoffman 1 day ago 9 replies      
There is a major thing going on here that is not mentioned in the article: One department doesn't deliver a website.

Is that "Terms of Service" modal there because a front-end dev thought "interrupting the user experience is a great idea!" No, it's there because of legal. And all those social sharing widgets? They are there because of marketing. And the 5 different ad exchanges? They are there from Sales/BizDev. And that 400KB of JS crap? That's Optimizely and their crappy A/B testing library that the dev team put in. And that hero image that's actually 1600px wide and then resized in CSS to 400px? It's there because that was the source image from the external agency and no one thought to modify it.

The biggest challenge with modern web sites/apps is that they are largely built by committee. Often it's not literally a committee, but multiple departments are all involved and all get to add to the site, while no one is really responsible for the final, overall user experience of the site.

And even if there is a "user experience" or "performance" team, they rarely have the power to change things. A customer of ours is an Alexa top 100 B2C company that provides a marketplace for buyers and sellers. They get commissions from sales, but a large part of revenue is ads. The "performance team" makes no headway against any of the terrible performance problems with ads because the ads team is judged on ads/conversions, not on the performance of the page. Even when the ads are hurting the conversion rates of the sales/commissions, the ads team doesn't care. It's a total deadlock of departments saying "performance and user experience is not my responsibility, I only do X".

ised 1 day ago 4 replies      
I enjoyed this article. But I have one nitpick.

The author suggests HTTP/2 as a solution to web cruft.

I could be wrong, but I see the HTTP/2 ploy as a proposed way to deliver more cruft, faster.

What do you think is going to be in those compressed headers? How large do HTTP headers need to be? What exactly are they trying to do? I look at headers on a daily basis, and most of what I see is not for the benefit of users.

Can we safely assume the compressed headers that HTTP/2 would enable would have nothing to do with advertising?

Again, I could be wrong, but in my estimation the solution to web cruft (unsolicited advertising) is not likely to come from a commercial entity that receives 98% of its revenue from web advertisers.

The web cruft problem parallels software bloat and the crapware problem (gratuitous junk software pre-installed on your devices before you purchase them).

The more resources that are provided, e.g., CPU, memory, primary storage, bandwidth, the more developers use these resources for purposes that do not benefit users and mostly waste users' time.

This is why computers (and the web) can still be slow even though both have increased exponentially in capacity and speed over the last two decades. I still run some very "old" software, and with today's equipment it runs lightning fast. The reason it is so fast is that the software has not been "updated".

myth_buster 1 day ago 3 replies      
This is incredibly ironic as I've used Telerik components for internal app development in our organization and the amount of cruft that gets loaded is way damn high. The payload is high and round trips are numerous. I ditched all that and developed my own framework from scratch using the open source libraries and managed to reduce the payload and increase responsiveness.

At some point in the past, it made sense to pay Telerik boat loads of money to get libraries that were supposedly plug-and-play but now there are even better solutions available for free thanks to OSS!


 Sounds a lot like Flipboard, doesn't it? If you're a publisher and you opt in, you let Facebook control the distribution of your content, in return for a far more performant experience for your readers, and presumably shared ad revenue of some sort.
This raised some red flags! Making the FBs (Facebook/Flipboard) the content platform just to reduce cruft and improve responsiveness appears to be a trojan horse, with issues similar to those discussed in the Fb Fraud thread [0]. Another possibility is that the FBs (Facebook/Flipboard) would become the Comcasts of tomorrow. The distributed nature of the web is what makes it so invigorating and democratic, and I think it would be a mistake to go the cable route.

[0] https://news.ycombinator.com/item?id=7211514

stevoski 1 day ago 3 replies      
I was in Kuala Lumpur, Malaysia a few months ago. I wanted to see a film. I searched in Google for films showing in Kuala Lumpur. I found myself on a web page that, while accurately listing movies and schedules currently showing, looked like it had been created around 2000.

And it was fast. No animations, no auto-complete, no infinite-scroll, no JavaScript frameworks. Just the information I wanted, delivered to my phone seemingly as soon as I touched the link. Simple black-on-white text, plain layout.

This made me somewhat sad, because it showed me what the web browsing experience could have been today.

cocoflunchy 1 day ago 6 replies      
Not sure why he took CNN as an example instead of this very article... http://i.imgur.com/4w9TxOw.jpg > 146 requests, 1.8MB transferred.
narrator 1 day ago 1 reply      
As far as the monetization of content goes, content creators are desperate because there is too much information out there. The people who make money with content these days are people who filter content and write for highly specific niches. I guess celebrity gossip and "Hottest Girls of Instagram!" also works pretty well as a content marketing strategy too. Give the devil his due.

One thing I hate about most content I get is how all the big stories of the day tend to creep in. I have a twitter account I use for personal marketing and I never ever post anything off topic or related to politics or whatever the hot meme of the day is. I only post information related to the very specific niche that I am covering. I pulled up my twitter feed recently and there are biotech people commenting on Greece. There are startup gurus retweeting Greece. If it's not Greece it's whatever is the hot topic of the day like <insert social issue that even saying you don't care about will result in social excommunication> or the Ukraine war or whatever. Frankly, I don't care, I don't have the time, it does not affect me personally. I have way too much stuff to think about already. This is why people don't pay for content. The supply is off the charts and the demand is not there and most people are just recycling crap that they read somewhere else anyway.

realusername 1 day ago 2 replies      
(My opinion here, I know not everyone is going to agree). What I really would like to have, is web capabilities without design, I mean the browser should be in charge of most design. What I would like is more meaningful tags like <panel>, <post>, <user>, <description>, <icon>, <horizontal-menu> and let the browser handle the actual representation. CSS could be just used for rough positioning + size indication and background-color could be completely replaced by a 'contrast' tag and the browser would display the color according to the user choice or operating system interface. The website would then adapt itself to the user and not the opposite.
ChuckMcM 1 day ago 0 replies      
And the answer: "Why does CNN show ads? To make money. Why does CNN include tracking services? To learn more about the reader, to show more targeted ads, to make more money. Why does CNN use social media buttons? To get people to share the article, to get more page views, to get more ad views, to make more money."

And as ads make them less money they show more of them, or more intrusive ones. They A/B test where to put them and how to make them "appear" suddenly in the hope of stealing a click, and they try to disguise them as other news stories. But the bottom line is that the ad-supported web, at the ad rates people can get, is challenging at best, and in cases like Dr. Dobb's Journal, well, they give up.

jasonsync 1 day ago 2 replies      
It's called "bloat". Maybe "selling out". Too many hands on deck. VCs want profitability.

It resolves itself usually. Sometimes.

Websites get bloated over time. Difficult to read. Slow to load. Messy UI. Runaway code. Ads everywhere. People stop visiting. Less bloated alternatives appear.

Slashdot ... MySpace ... been there, done that. Reddit, Imgur ... drifting slowly but surely towards bloat.

Mobile apps suffer too. To a lesser extent due to limited screen space. Poorly designed apps, non-native apps, heavy Javascript frameworks, ad popups etc.

Even worse, when a mobile developer decides to build a simple website for the first time.

Install Mercurial, Vagrant, Bower, NPM, Grunt, Mongo, Express, Mongoose, Passport, Angular .. update everything .. cache clean everything .. check your environment settings .. mess around with Heroku .. create a Docker image for easy deployment. Spin up a virtual machine.

Now hand off that 5 page website to someone else when the project is complete. They'll add bloat to bloat.

Better or worse?

The web is old. We're focused on apps. Eventually we'll move back to the web and clean things up.

Mithaldu 1 day ago 1 reply      
Minor gripe: I wish people would stop talking about "ad blockers". Pretty much since I started using the web (back when, with Opera) I've had a tool available that is much more general and has given me much more control over how the web uses my bandwidth:

A URL blocker.

I don't use it exclusively for ads, although lots of those fall under it too. I use it to block anything I find annoying when I use the web, be it overly big header images, fonts I don't like, Javascripts that many pages use to "enhance the experience", and sometimes ads too.

Thinking about it as a tool to only block ads, instead of one to customize the web and block urls themselves seems narrow-minded to me and misses the point.

Animats 1 day ago 2 replies      
Cruft removal suggestions:

1. Read the source of your own web pages. What's really necessary?

2. Do you really need more than one tracker? Does anybody need 14 trackers?

3. "Social" buttons don't need to communicate with the social network unless pushed.

4. Stuff that's not going to change, including most icons, should have long cache expiration times so they don't get reloaded.

5. Look at your content management system. Is it generating a unique page of CSS for each content page? If so, why? Is it generating boilerplate CSS or Javascript over and over again?

6. Run your code through an HTML validator, just to check for cruft.

7. You can probably remove the IE6 support now.
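
For point 4, long cache lifetimes are usually a single server directive. A minimal nginx sketch (the extension list and the one-year lifetime are my own illustrative assumptions, not part of the list above):

```nginx
# Serve static assets with a one-year cache lifetime so returning
# visitors never re-download them.
location ~* \.(css|js|png|jpg|gif|ico|woff2)$ {
    expires 1y;
    add_header Cache-Control "public, immutable";
}
```

This only makes sense for assets whose URLs change when their content does (e.g. hashed filenames); otherwise a stale copy can sit in caches for a year.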

irq-1 1 day ago 0 replies      
An unmentioned alternative is Firefox "Reader View" or Readability [0], which reformat the source post-download. They don't work on all sites and can get some things wrong. Still, it's a solution that's open, distributed, and can't be easily stopped by content producers. Imagine a Firefox mode that always had Reader View on: a sort of filtered/limited web browser.

[0] https://www.readability.com/

ommunist 1 day ago 0 replies      
The web today does indeed look broken. But this (oh, especially the CNN) example only shows that the problem is merely organisational. If there is no QA process in an organisation, quality criteria are non-existent. If there are ever-shifting bossy MBAs in top design positions, it will always be like that and get worse over time. The only thing we can do as "web crafters" is to make our personal sites suck less. When it comes to the job, who will sacrifice job security for the sake of "better user experience"?
d--b 1 day ago 4 replies      
Strange that this is coming from Telerik, a company that creates really heavy and closed JavaScript framework libraries that you can probably load from their own domain. They definitely have their part of the responsibility in that 'cruft'...
alistproducer2 1 day ago 2 replies      
It's what it has always been. Sites/companies that go the extra mile and provide better user experiences get rewarded; others that don't lose market share. At the end of the day, if the content is good enough, most people will deal with the crappy load times.
markbnj 1 day ago 1 reply      
I think "the web" in this case really means media sites and some retailers. The media sites don't have a clue how to make money, other than to include all the usual ad networks and tracking scripts and then sit back and hope. The ecommerce guys are obsessed with slicing and dicing visitor behavior and tracking conversions. Seems to me that outside these (admittedly rather vaguely defined) spaces the web is a much cleaner and snappier place.
guelo 1 day ago 0 replies      
The latest stratechery article discusses this same topic from more of a high level industry trends and business forces perspectives, https://stratechery.com/2015/why-web-pages-suck/
AndrewKemendo 1 day ago 2 replies      
Isn't the fundamental issue here that nobody wants to actually pay money for web content?

Solve that and you solve the cruft issue.

cognivore 1 day ago 2 replies      
HOSTS file ad blocking. Bang! Problem solved.


This works so well I have a vastly different user experience than other people. I rarely see ads and pages load fast.

Of course, you can't do this on the stupid devices we use but don't actually control (phone, tablets, I'm looking at you), so I wish someone would offer a DNS service that blocks these. I'm thinking of creating one for myself at this point.
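
For anyone who hasn't seen it, the technique is just mapping ad/tracker hostnames to the loopback or an unroutable address in /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows). A tiny sketch; the hostnames below are placeholders, not a real blocklist:

```
# /etc/hosts fragment: requests to these hostnames resolve locally,
# so the ad/tracker content never loads.
0.0.0.0    ads.example.com
0.0.0.0    tracker.example.net
127.0.0.1  metrics.example.org
```

In practice people maintain (or download) lists with thousands of such entries; the browser fails fast on each blocked hostname instead of fetching it.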

Touche 1 day ago 0 replies      
There are some strong points in this article but I think it reaches the wrong conclusion. Flipboard and Instant Articles are not an alternative business model. They are a loss-leader, plain and simple. The goal is still to get you on their webpage where you'll see the ads.

Good ole capitalism solves this problem quite easily. CNN banks on its brand to keep people coming to its site, but there are plenty of non-terrible news sites on the web, so those who care about a nice user experience can just use those instead.

andreapaiola 1 day ago 0 replies      
Well... I try to do something...


Touche 1 day ago 0 replies      
Here's a first-pass user style sheet that gets rid of most of the noise.

  #breaking-news,
  [data-video-id],
  .sibling,
  .pg-rail,
  .js-gigya-sharebar { display: none; }
  body { padding-top: 0 !important; }
  .nav-header { position: relative; }

FollowSteph3 14 hours ago 0 replies      
I found it ironic that the author writing about web cruft also had some on their own website. The header image on my cell phone in landscape mode took up 20% of my screen and stayed even as I scrolled. Anyway, I thought that was pretty funny :)
mirimir 21 hours ago 0 replies      
I typically browse via nested VPN chains and Tor, and overall latency is ca. 500-1000 msec. Leaving aside my concerns about privacy and malware, ads and third-party tracking conversations simply don't work for me. My circumstances are unusual, of course. But I suspect that many who connect via geostationary satellites have similar experiences.
natch 1 day ago 0 replies      
Just want to say, that is a really nicely written and presented article. Loved the way it had a good lead-in, clear and well explained examples with good screenshots, and how it dug into the whys and also explored the future without driving some agenda other than its main point of understanding the cruft problem. I realize the point of the article wasn't about quality of content but more about all the clutter around the content, but it bears saying, if the author is here to hear, that this was some really well done content.
msane 1 day ago 1 reply      
What I find really silly is "would you like to subscribe to my blog?" popups which appear over top of the content after scrolling.

Let alone the browser-level modal: "Allow This Person's blog to send you push notifications?"

I've come to dislike the first 5-20 seconds after loading a news article on mobile, while ads and popups are still adjusting the position of the article text.

njharman 1 day ago 0 replies      
That chadburn looks like it has a marble face and seems unusual. Which made me go lookup what a chadburn is https://en.wikipedia.org/wiki/Engine_order_telegraph
williamcotton 1 day ago 2 replies      
The cruft is because the underlying economics of how data is hosted and distributed is flawed. The incentives between advertisers, readers, publishers, hosting and distribution providers are currently misaligned.

In other words, building an info-economic system based on "free" content that is supported by online advertising results in: the people who make the content don't get paid enough and the content itself is horribly mangled by advertisements and other priorities.

However, we can build other kinds of info-economic systems. Perhaps we could follow the model of publishing from back when we still used copyrights as an effective way to align the incentives between authors and publishers. Covering the costs of hosting and distribution is a lot easier to manage when the content itself isn't hard-coded into an ad-selling machine, now that delivery options like BitTorrent or WebTorrent exist. Permanent seeds could be kept alive at very little cost to publishers. Perhaps we could experiment with royalties, or selling virtual shares in media, or buying virtual goods that list you as a supporter, or allowing people to invest in the creation of, and share ownership in, media...

flinty 1 day ago 0 replies      
The article by Ben Thompson on Stratechery is a good one to go along with this one. https://stratechery.com/2015/why-web-pages-suck/
api 1 day ago 3 replies      
We had a similar problem in 1999. They called it 'portalitis' back then. Go find some shots of the old Yahoo homepage for an example.

Then Google came out with one blank and 'search' and we finally exited that ugly crufty era.

Now we're back!

amelius 1 day ago 0 replies      
Don't worry. Eventually the cruft will get filtered out by AI, just like in adblockers, but in a much more powerful way.

Essentially, the circuits in your brain that remove cruft subconsciously now, will be implemented in software.

Too 19 hours ago 0 replies      
So in summary RSS is new again?
odiroot 1 day ago 0 replies      
But I like my weather widget on news sites. It's the second most useful feature beside the articles.

Naturally it doesn't make sense if it shows weather for a place across the ocean.

emodendroket 1 day ago 0 replies      
The Gruber quote is (as is typical for him, I guess) insane hyperbole. The mobile web is not "looking like a relic" because I'm not going to install an app to read an article; that's more trouble than dismissing the modal yammering on about privacy policies.
Homejoy says goodbye homejoy.com
345 points by philip1209  8 hours ago   368 comments top 77
jclune 6 hours ago 7 replies      
I run a company similar to HomeJoy in Japan, and I disagree with most comments here.

1. Those who claim cleaning is not a "skilled job" should get off their keyboards, spend a day cleaning their mom's house, and get back to me when they learn they are 3x slower than a pro and have destroyed something with a bleach-based spray.

2. It feels like nobody here actually read Adora Cheung's quotes about multiple lawsuits coinciding with investment timing. That is obviously the reason they shut the doors.

3. Those who claim flaws in the matching business model are to blame are making non-quantitative assumptions. Just ask yourself: how high a percentage of users sharing direct phone #s would cause a growing company to collapse? Also, the supply side risks losing stable income or insurance coverage.

I do quite a number of things much differently than HomeJoy, and quality control being a major one. There are certainly a lot of challenges and operational complexity to keep my team innovating. Japan has the highest customer service standard in the world (and hardest to satisfy customers), which is great for building a more solid foundation. If any hackers are leaving HomeJoy and want to move to Tokyo, I'm hiring!

jasonwilk 8 hours ago 7 replies      
I wanted HomeJoy to work and tried to use the service a few times. What the failure came down to:

1. The cleaners were not professionals. It felt like they were just recruiting anyone who wanted a job. You could really feel this with the lack of passion from the cleaners and the severe lack of quality cleaning. Most of the people complained and were quite rude sometimes. Cleaning is very much a skill as much as it is manual labor.

2. People don't want to let just anyone into their home to clean. Especially for those that have valuables, you want someone you trust who is going to hopefully be your maid for years to come. I wouldn't want someone new every time and for that reason, I used HomeJoy only at our office but even that wasn't enough due to poor quality.

Someone else in this thread explained this well: HomeJoy is, at a high level, a matchmaking service. Once you find a match, why do you need HomeJoy? Connect directly with the maid and have them come on your own schedule for a fraction of the price.

I used to feel the same way about Uber. If I found a good driver back in the day, I would get his phone # and use him exclusively for airport trips. Uber solved this with UberX and a wealth of seemingly skilled drivers, which made it a true on-demand service. Having to book an appointment like HomeJoy's is not truly on demand, just a nicer UI than any other maid-connecting service.

AirBnB I feel had a similar problem. For frequent business travelers, finding the right place at the right price is awesome and I would usually try to stay at the same place and connect with people directly. AirBnB solved this with overwhelming demand for the service (the place I like may not always be available) and with their insurance policy and scheduling tools for renters.

In any event, I don't feel like HomeJoy failing is indicative of a bubble in the on-demand economy. There were inherent principles of this business that made it destined to fail, which others in the space won't have a problem with.

fraXis 8 hours ago 24 replies      
I used Homejoy and I liked its ease of use. But after the cleaning lady they sent me was done, she offered me her direct phone # and told me I could contact her directly for any future cleaning needs.

I always wondered how this business model was going to work if Homejoy's contractors could just give out their phone # at the end of their first service and the customer could just contact them directly for any future needs, instead of going back through Homejoy for any future bookings.

tptacek 8 hours ago 8 replies      
I get the sense that many of the people involved in Homejoy are well-intentioned and hardworking. But I can't say that I think it's a bad thing that tech startups are finding it difficult to monetize unskilled labor.

The technology that most companies like these offer (with the possible exception of Uber) is a commodity. The real asset they have is the network effect. Which makes the balance of power between the tech company and the "1099 contractors" deeply suspicious. What are these companies doing for the laborers that makes them valuable enough to be skimming returns from the work?

callmeed 8 hours ago 2 replies      
So Forbes named Homejoy one of the hottest startups of 2013 (and they made a 30 under 30 list). They raise money from top-tier VCs and angels. Adora does quite a bit of speaking (it seems) on growth, regional expansion, and startup inspiration. They expanded to 30+ markets.

But they were really just selling cleaning services for below coston the backs of 1099 workers. It's a worse model than Groupon and I can't fathom how the founders or investors thought it would work.

It all feels icky to me.

storgendibal 8 hours ago 1 reply      
Higher prices, questionable convenience, and poorer service.

The quality of the cleaning had deteriorated over time and the price was higher than independent cleaners I later found through personal referrals.

I had 4 different Homejoy cleaners. The very first one was awesome, and I thought, "Wow, this is great". Then every single cleaner after that was terrible. Two of them started wet-mopping before vacuuming or using a dry Swiffer, thus pushing wet dirt around. Another one let the toilet brush sit in the toilet in a way that the entire brush, handle and all, fell in. When I came over to oversee her, to make sure she didn't do other things like that, she got angry and refused to work until I went into another room.

As others have said, the model sucks. In order to make money, you need to charge above market rates to get your cut. In order to justify that, you need to offer something to both sides of the market. The customer expects convenience and high quality service. If you cannot provide both of those things, why would people use Homejoy? And of course, what are you offering the service provider to stay on your platform?

Lastly, there was no Android app and the web app had so many bugs that even the sign-up flow was hit or miss. The sign-up flow! Logging in from my phone never worked and I always had to use a laptop. Unreal.

acabrahams 8 hours ago 4 replies      
I never used the Homejoy service, but I was in the audience for the Startup School Europe talks last year where Adora gave a fantastic speech (Notes: http://theinflexion.com/blog/2014/07/26/notes-from-startup-s...) about going through so many ideas and working so hard to get to Homejoy. The talk's ending had an 'And look, we made it, so you can too' feel, and I had no idea that they were doing anything but crushing it after all those years of grinding work.

It's hard not to be disheartened when a pair who seem to have worked as hard as they have still don't make it with an idea. I just hope they keep going.

gsharma 8 hours ago 0 replies      
I never used Homejoy, as they were pretty pricey. I have cleaners charging me 1/3 of what Homejoy quoted.

That being said, I talked to several (10-15) people who used Homejoy at least once. The response I got from most of them was that the cleaners weren't professional. In fact, several mentioned that the cleaners didn't even know what Homejoy was; they were sent for cleaning by their contractors. In other words, the cohort I talked to had a really low NPS for Homejoy.

I think Homejoy wasn't able to nail down that user experience of their real product (i.e. Cleaning) no matter how amazing their on-boarding/booking experience was.

It must be very disheartening for the founders and the team. All the best to them in their next adventures!

iblaine 5 hours ago 1 reply      
Their HQ was a mess. If your business is cleaning, then you should have cleaning in your blood and dream about Pine-Sol when you sleep. I get the sense that the company was started for the sake of starting any company. In doing so, they created a company without any vision.

Look further into the company and you see things like the founders saying working on Christmas Eve is ok. Presentations where they say luck is irrelevant and working hard and smart are the keys to success. That's some American Psycho sh*t.

philip1209 8 hours ago 3 replies      
They've raised almost $40M in funding:


mbesto 7 hours ago 2 replies      
As tptacek said, I'm sure many of the people who created and work for these businesses are well intended and hard working, and we have to applaud them for that.

However, the writing is on the wall for any "sharing economy" service that is simply a technology wrapper around non-SSN'd workers in the US. A few challenges that aren't solved by technology:

1. Many unskilled labor positions, especially those that incorporate illegal immigrants, are paid in cash.

2. The price points are absurdly low to create any sort of margin to sustain a business (I can only assume most of these companies are doing < 10% gross margin)

3. Workers' responses turn sour when they realize the system inevitably becomes indentured servitude.

4. "Rigorous background checks" - I have yet to see how technology has made background checks any more "rigorous" or how this has allowed companies to scale the quality of workers.

5. Scaling quality - many of these services (more so for services like Thumbtack) start by hiring skilled people (usually MBA students, aspiring actors, etc.) who are looking to earn a few extra bucks for fairly unskilled activities. People enjoy the service not only because they get a higher-quality service but also because "it comes with a smile." There is only a limited pool of these types of workers, which inevitably means the supply side of the business gets eliminated at a certain scale.

I think there might be a place for these types of businesses, but perhaps not in the venture world.

zaidf 8 hours ago 1 reply      
My problem with Homejoy/Handy etc. is the high degree of unpredictability in the quality of each cleaning. My conclusion is that for something like home cleaning, you want to find a regular person -- not someone new each week.

Our regular cleaner (whom I found through Nextdoor) recently had to quit. It was enough to upset me for a bit, because finding a quality, consistent, and affordable cleaner is very hard. Luckily I was able to find another promising cleaner on Nextdoor.

When looking for a regular cleaner, one of the things I have learned is that you want a "career" cleaner. You don't want someone who is doing it as a pastime to make some extra cash. That might work with Uber/driving, but it doesn't seem to work as well with cleaning. If a non-serious Uber driver cancels the ride, you just call another one. If your cleaner doesn't show up, you can easily lose a day before finding a replacement (and hoping he/she delivers).

birken 8 hours ago 0 replies      
At the end of the day, this is a lesson in unit economics. If you want to make a successful startup, your unit of sale better be super profitable.

Most successful technology companies sell bits, with high fixed costs but very small marginal costs. This means if you grow really big, your fixed cost growth will eventually flatten out but your profit can continue to grow. Facebook, Google, AirBnb, Uber.

If you are selling something in the real world, especially if you are owning the whole process, then your marginal costs are going to be pretty high, which means your profit margin is going to be lower. This is completely fine and a ton of businesses run like this, but these businesses have to be very careful with their fixed costs. You can't grow like your average tech startup because this model is different than most tech startups.

The exact reason they are going out of business is less important than the fact that a business with high marginal costs is very fragile. A slight increase in a high marginal cost can destroy your profit margin, whereas if your marginal cost is extremely low then you are much more resilient.

High fixed costs + low marginal cost = Good

High fixed costs + high marginal cost = Be careful
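The contrast above can be sketched with a toy profit function (all figures here are invented for illustration, not taken from Homejoy's actual financials):

```python
def profit(units, price, fixed_cost, marginal_cost):
    """Total profit given a one-time fixed cost and a per-unit marginal cost."""
    return units * price - fixed_cost - units * marginal_cost

# "Bits" business: high fixed cost, near-zero marginal cost.
print(profit(100_000, price=10, fixed_cost=50_000, marginal_cost=1))   # 850000
# Real-world service business: same fixed cost, high marginal cost.
print(profit(100_000, price=10, fixed_cost=50_000, marginal_cost=9))   # 50000
# A $1 rise in the marginal cost erases the service margin entirely.
print(profit(100_000, price=10, fixed_cost=50_000, marginal_cost=10))  # -50000
```

At the same volume, the low-marginal-cost business keeps most of its revenue as profit, while the high-marginal-cost one is one small cost increase away from losing money, which is the fragility described above.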

telecuda 8 hours ago 0 replies      
Homejoy customer here. For me, they succeeded in making home cleaning accessible to a guy in his early 30s who clicked a few buttons past a Facebook ad to schedule 2x/mo cleaning - a big stress reliever and quality of life improver. I never used a cleaning service prior to Homejoy.

Where they failed was in providing an adequate supply of cleaners (no one available for weeks on occasion) and last-minute cancellations without substitutes.

Never did a cleaner solicit me to hire direct, but I DO think Homejoy should leave their customers with a way to reach the cleaners they liked, for rehiring. Why not, after all?

ChuckMcM 7 hours ago 1 reply      
And the Recode story (http://recode.net/2015/07/17/cleaning-services-startup-homej...), which has a bit more clarity. Trying to fundraise for a sharing-economy business just after Uber got a bad court decision on the employee/contractor question is hard. It is almost like having your company go back to being a 'seed round' level risk for some investors.

It also lends credibility to the 'not a bubble' discussions if stage C investors are showing restraint but that is a different discussion.

stevejohnson 8 hours ago 1 reply      
For a house with 4 roommates, this service worked great as a solution to the "tragedy of the common [space]" problem. We've been using them for a year or more now and I'll be sad to see them go. I hope they follow through on putting people in touch with the actual cleaners.
pyrrhotech 8 hours ago 4 replies      
I really loved the idea, and maybe I'm just being cheap, but there was no way I was going to pay $250+ a month to have my 1,296 sq ft house cleaned. That's a $150/month value at market rates in my area; $100/month for a bit of convenience was not worth it at all to me.
binarysolo 8 hours ago 1 reply      
I'm surprised they didn't offer to transition/sell their customers to Handy... that's prolly mid-tens of dollars in lead-gen fees off an active customer list of... thousands of customers probably (as well as a less-active list prolly in the tens of thousands range)?
artag 4 hours ago 0 replies      
There is a ton of analysis in this post about what went wrong. As someone who runs a service marketplace business, I know how hard it is to scale this type of business (scaling quality, managing a remote workforce, supply churn, repeat usage, and all the other things...). The founders worked long and hard to succeed (knowing them firsthand). They also happen to be incredibly nice people - always willing to help others. All the best to Adora & Aaron. I look forward to seeing what you launch next!
ylhert 8 hours ago 1 reply      
Is this the beginning of the end of the on-demand bubble?
narrator 3 hours ago 0 replies      
I think the problem here is low quality and a lack of adequate screening. This seems to be a problem with a lot of online businesses. In a different way, it was a problem for PayPal when they started: they were overrun with fraud and had to find clever ways to control it. Anytime you interact with the general public, especially people without a resume doing cheaply paid work, and with the general public internationally, you are going to have, among the quality people, a bunch of bad apples: people with mediocre talent and weirdos, disgruntled and otherwise. The value a sharing-economy online service adds is screening all that out and delivering high quality.

When moving into a new area it takes a while to figure out who the trusted providers are. Sometimes the top guy on Yelp is outrageously expensive and overbooked and you have to ask around the neighborhood. Then when you've found the right people you trust, you form a relationship and use them forever and they take care of your stuff. The problem is that once I find my trusted guy for X I just stick with him. Homejoy seemed to be turning that experience into a dice throw.

nicholas73 8 hours ago 1 reply      
I once asked Adora why they weren't a marketplace instead. She replied that the branding and bonded/insured cleaners would be more attractive to customers.

Obviously it's easy to say in retrospect what mattered in the end, but I'm wondering: what would be a way to test this kind of assumption?

I go by my own use case - I really only care about price, and I build trust by the person and not by the company. I prefer to stay home while it's being serviced as well.

piratebroadcast 5 hours ago 0 replies      
This is the same company that posted the infamous Christmas Eve job opening ad to HN: https://news.ycombinator.com/item?id=8794956
bkjelden 8 hours ago 1 reply      
I think the key differentiator between successful "uber for X" companies and unsuccessful ones will end up being the customer experience in the industry each company attacks.

Before Uber, getting a taxi sucked. Talk to anyone who traveled extensively before Uber, and they'll have plenty of stories about cabbies who tried to rip them off.

AirBNB is having similar success because getting a hotel room also sucked: it was often overpriced, the quality of the room was often poor, and as a consumer you didn't really have any feedback into the system.

But home services? Most people I know are able to mine their personal network pretty easily to find good home cleaning and home repair services. Or at least, it's a lot easier for them to do that than not get ripped off by a taxi driver.

These companies won't win in every industry "just because". In order for these companies to be successful, they have to bring some improved experience to the table that customers simply won't be able to live without once they've tried it.

Sad to see things come to an end for the team, though. I hope they find great success in whatever they pursue next.

iaw 2 hours ago 0 replies      
I met someone who worked for Homejoy and was stupidly compensated; before Homejoy he had worked at Groupon. Verbatim, he said to me: "I never want to work at a company that is profitable."

I had horrible experiences with their service, and I believe that multiple factors contributed to this shutdown, but I can't shake that quote from someone being paid a quarter million dollars a year by Homejoy.

zach 7 hours ago 0 replies      
A lesson to take away here is that a startup which faces legal/regulatory threats to their existence has to be small enough to be ignored or huge enough to change the rules.

Uber is probably the first company you think of, with their absurdly large investment rounds, or maybe PayPal, but don't forget YouTube. They went from a little video portal with everyone else's stuff on it to a protectorate of one of the world's largest businesses so quickly the copyright holders had no chance to deprive it of existence.

The danger zone is in the middle. Homejoy didn't expect to be in this kind of trouble, but once they were, it made finding a huge round of funding both necessary and impossible. So this is the rational decision.

ivankirigin 8 hours ago 2 replies      
Cofounder Adora Cheung is worth following. The story is a great example for founders. Check out her lecture at Sam Altman's startup class: https://www.youtube.com/watch?v=yP176MBG9Tk
outside1234 8 hours ago 1 reply      
Pretty interesting analysis around how the funding in the sharing economy is drying up after the "they are employees" decision by California.
tomlongson 8 hours ago 1 reply      
I have a hard time understanding this. The markup and demand are extremely high. Were they just overvalued and so couldn't meet expectations?
sprkyco 7 hours ago 0 replies      
Looks like Google may have picked up some of the team: https://news.ycombinator.com/item?id=9904483

So not ALL that bad.

cjf4 8 hours ago 0 replies      
This one's interesting to me, because it seems like they had done all the right things, had a real product, and worked really hard (I think I remember Graham saying she worked as a cleaner part time during YC).

Without knowing any of the specifics, this seems to be a good correction of a couple threads of thought. Most notably, startups are not a science, and not every business that is technologically lagging will be drastically changed by a modern tech platform.

It really comes down to value. Seems so simple but I think the startup community would do well to incorporate the importance of creating value in their pedagogy.

bryanlarsen 6 hours ago 1 reply      
Problem A: good cleaners bypass your service, using you as a cheap referral service.

Problem B: paying cleaners as contractors is legally problematic.

The solution to both of those problems is the same: hire your cleaners as employees. Then you can prevent them from freelancing on the side in a competing business. In states where a non-compete isn't enforceable (like California), I believe you can still enforce it while they are employed by you.

soheil 7 hours ago 1 reply      
I live in SF, and the few times I used the service I was relatively happy. I certainly thought the cleaning was done better than Handy's, but Handy had them beat on pricing. I also thought their UI was much friendlier than Handy's. With Handy, there is no way to terminate a recurring cleaning without contacting customer service.

In busy places like SF, traffic and parking seem to me to be a major hindrance for services like Homejoy. Perhaps if they provided a shuttle service of some kind for their cleaners, it would solve that problem.

Also, there is a lot of customization when it comes to cleaning vs. Uber, for example. With Uber you just go from point A to B, a pretty well-defined problem. With cleaning services it's a bit more complicated: do you clean the light switches, door knobs, under the sofa (what if it's a 1-ton sofa?)... I think people have vastly different expectations of what it means to have someone clean your house. Sure, with Uber you care whether the driver is rude, plays your favorite music, and doesn't drive like a lunatic, but there aren't a hundred other things that would significantly influence the experience; with Homejoy there are.

bane 5 hours ago 0 replies      
Having recently spent a bit of time finding a roofer, an all purpose handyman, a drywaller, a cleaner and a yard guy, I finally have a good personal team I can use. But my god it took a couple years to assemble them. And there were quite a few duds along the way. I went through 5 roofers just to get one small job done and finally had my handyman just go rent some ladders and do it.

It sounds like such a first-world problem, but finding quality people at reasonable prices to do this stuff is such a pain. I didn't use Homejoy to find these folks, I just worked my personal network until I got them, but I can definitely see a market for something like Homejoy.

It's a really great idea and it's a shame it didn't work out.

philip1209 6 hours ago 1 reply      
Looks like Google is hiring the Homejoy engineering team:


andersonmvd 8 hours ago 0 replies      
I saw them on How to Start a Startup. It was a good lecture. Sad to see them closing their doors.
throwaway239842 7 hours ago 0 replies      
Posting with a throwaway to address a few realities:

1) I used Handy to book a cleaner for the first time -- I loved her, she did a great job, and I immediately cut out the platform and hired her directly. I felt a little bad about it, but not enough to not do it. Handy got a single transaction out of me, for what is now approaching a year of work. Transaction-fee marketplaces work best where there isn't a natural inclination towards an ongoing relationship (Uber, AirBnB, eBay).

For example, on AirBnB, I'm actively looking for a different adventure each time, so it's hard to cut out the platform. When I'm traveling on business, it's the exact opposite, I'm looking for a reliable, consistent experience with no surprises. That's why hotel chains try very hard to ensure a completely consistent experience between stays, and even between hotels -- so that I can book a Westin anywhere in the world and know that I will be getting the exact same bed: https://www.westinstore.com/westin-heavenly-bed.aspx

2) Homejoy is at a disadvantage because they cannot hire illegal immigrants, and cannot pay them under the table. Under-the-table payment and illegal immigrants are common in all-cash service businesses; this is no exception. (My cleaner is not an illegal immigrant, since she was able to work through Handy, but she always shows up with a different partner, and very few of her partners speak any English - I suspect that at least some of them are not legally authorized to work.) I also suspect that she does not report everything, or anything, on her taxes.

3) Home cleaning is a high-trust business and one where I'm likely to want the same cleaner each time. The marketplace is interested in sending me a different cleaner each time. The company's needs are in conflict with its customers' needs.

mslev 8 hours ago 1 reply      
"Unfortunately, we cannot process any refunds or credits at this time."

Ouch. What would cause them to not be able to offer refunds?

arihant 8 hours ago 1 reply      
Could they operate as a small business and/or flip the company as a running small business? I imagine home cleaning is sort of a loyalty space, where they must have had a good number of regulars?

It's also interesting that they were managing a workforce of around 1,000 cleaners, and they burnt through $40MM. That's the amount of seed money Elon Musk needed for SpaceX.

serkanh 8 hours ago 0 replies      
Any type of business that is based on being an intermediary between buyer and seller runs the risk of "losing" customers. This holds especially true for service businesses. I ran a nationwide lead-generation service that matched POS/payment-processing companies with retail/restaurant businesses and was purely performance-based: no initiation fee, no monthly fees to enroll, only a 5% commission on closed leads. And guess what: even if I sent a business tens of leads (averaging $7,000 in value with at least 40% profit) per month, they resorted to either not paying or simply became incredibly sloppy on follow-ups, so I was forced to stay in the loop. I may not have brought any additional value to the acquirer of the service, but for the service provider I was offering a qualified business opportunity they would not otherwise have obtained.
Simulacra 6 hours ago 0 replies      
I thought HomeJoy was an awesome idea at first, until I started reading about the experiences, and tribulations, of their ...contractors? Not even sure what to call them, other than hardworking people who, when you balance it all out, weren't making that much. I'm sure the CEO and her brother made a killing, but house cleaning is tough work, even for people who do it professionally every day, for a living. I think those people should be employees, they should be given job protections, and they should be treated (and paid) a lot better than they are.
ryandrake 6 hours ago 0 replies      
When I first heard of this company, I thought, "Too small a niche to grow--they'll never survive". This is a service for a very limited market--rich people who don't already have a housekeeper and can't manage to keep their houses clean. Nobody I know socially would actually pay someone to clean their house for them.

Yet, there are so many customer testimonials here on HN. Are there really that many people so busy/well-off that they can't take a few minutes once a day or so to pick up after themselves, and would rather pay someone more than their mobile phone bill to do something so trivial? I guess I was wrong about the market size, but damn...

jimjamjim 8 hours ago 3 replies      
As a former customer (have not used in years), I actually felt uncomfortable with the cleaners. Not because of anything they did, but because you could just tell they were low income and being paid very little by Homejoy (I think less than $15/hr). It didn't feel right, and I stopped using them in part because of that.
aranibatta 3 hours ago 0 replies      
Rather than looking at it from a purely business perspective, I think that HomeJoy accomplished a lot in its tenure. It set an industry standard and a format that I'm sure will stick around for a long time. It made huge strides in quality control and tackled a lot of the problems in the industry, if not completely solving them. You can only hope that Google keeps that spirit alive, and having met her, I can say Adora Cheung left an amazing impression on me. I'm sure whatever she chooses to do next, she'll do it well.
shah_s 6 hours ago 0 replies      
How can they raise $40m and fail so quickly? How did they burn through that much cash? I hope they actually explain why they failed so others can learn from it.
euphoria83 8 hours ago 0 replies      
I tried using Homejoy multiple times but decided against it each time. The reasons varied from their cost, to not getting specific enough services, etc. I think they lacked in implementation, at least a little bit.
nodesocket 7 hours ago 0 replies      
I literally just had my place cleaned yesterday by HomeJoy. I have a small studio apartment in SF, and it cost $110 total. While it was expensive, the cleaner actually did a really great job, though it was a bit difficult communicating (she is Chinese and not fluent in English).

I honestly never saw this coming, as I assumed HomeJoy was doing awesome. It will be interesting to see how Exec/Handy handle things moving forward.

The startup game is tough!

jondubois 3 hours ago 0 replies      
That's surprising considering how heavily marketed they were. I think everyone assumed that they were going to be a success.
jjarmoc 8 hours ago 4 replies      
When I hear about companies like this going under, I always think how odd it is that I've never previously heard about them.

Then I realize, that may have been part of the problem.

codingdave 8 hours ago 0 replies      
HomeJoy seemed OK, but I never seriously looked at them because I have always been so happy with ServiceMagic (now HomeAdvisor). When you have a solid competitor who has been in business since 1999, you need a strong differentiator. HomeJoy was/seemed more focused on cleaning, HomeAdvisor on repairs and improvements... but it always seemed to me that they were trying to re-invent a wheel that didn't need re-invention.
amerf1 8 hours ago 0 replies      
"I used Homejoy and I liked its ease of use. But after the cleaning lady they sent me was done, she offered me her direct phone # and told me I could contact her directly for any future cleaning needs."

I used Airbnb and the host told me the same thing: he said book 3 nights, and for the other 20 pay me cash or transfer the amount to my account, avoiding the Airbnb fee.

Skrypt 7 hours ago 0 replies      
As happy customers & friends of the company, we're sad to hear this news today. We wish everyone on the team all the best.

I'm curious to one day read a post mortem on the company, and especially about these last few months.

Our last appointment is scheduled for Monday.

Thank you Homejoy.

blake8086 8 hours ago 0 replies      
I actually just had my home cleaned yesterday through this service. I'll be sad to see it go =(
S4M 7 hours ago 0 replies      
I'm surprised. I don't know much about Homejoy, but I read once that it was the company in the YC portfolio with the highest growth, so I thought they would become a unicorn at some point. What happened to them?
7Figures2Commas 7 hours ago 0 replies      
Homejoy was (at least at one point) the fastest growing company in YC history according to PG[1], and if you go back and read the press, particularly around its funding, it was treated like it was already a success.

Its rapid demise is a good reminder that growth isn't profit, and funding from top tier investors doesn't actually signal that you are building a sustainable business.

Incidentally, I have pointed out the employee misclassification issue numerous times[2][3], and wrote last year[4]:

> It's going to be very interesting in the coming years to see which of these on-demand companies continue to thrive because I personally think it's inevitable that many of them are going to be forced to reclassify their workers as employees. I suspect some investors aren't giving this enough consideration in their due diligence.

If investors are now doing their due diligence (gasp) and realizing that many of these portfolio companies are not going to be able to effectively defend against misclassification class actions, Homejoy is not going to be the last of these highly funded on-demand companies to hit a wall.

[1] http://www.reddit.com/r/EntrepreneurRideAlong/comments/1uyr6...

[2] https://news.ycombinator.com/item?id=8489834

[3] https://news.ycombinator.com/item?id=8468863

[4] https://news.ycombinator.com/item?id=8709632

kevinkimball 8 hours ago 4 replies      
what happened?
icelancer 8 hours ago 2 replies      
The service was getting increasingly terrible in my market. I am not surprised at all to see it go the way of the dinosaur. Too bad, this type of service has a lot of value to busy professionals and people with families.
handy_nyc 7 hours ago 0 replies      
I can very confidently say that Handy is headed in the same direction as Homejoy. Handy is facing a myriad of class action suits in Boston and a number of legal battles over exempt/non-exempt employee classification. I've seen the cash burn in the space, and it is very unfortunate that senior leadership doesn't pay attention to it. It's all about making it look like hockey-stick growth to investors. The cleaning professionals working on the platform are all very unhappy too.
confiscate 7 hours ago 0 replies      
Sorry to hear this guys. Know you guys worked hard on this. Best of luck
jtwebman 7 hours ago 0 replies      
What was the added value over other smaller cleaning services?
h2014 6 hours ago 0 replies      
There are a lot of these types of on-demand startups cropping up in Europe, so it'll be interesting to see how that plays out.
soheil 7 hours ago 0 replies      
Why is the urge to hijack scrolling so high? Is it because the people those developers work for have shitty mice/trackpads?
minimaxir 8 hours ago 4 replies      
This serves as a strong counterpoint to the infamous "Dear Future Homejoy Engineer" HN job posting: https://news.ycombinator.com/item?id=8794956

Working as a family on a holiday may not be enough to save a startup.

dlu 8 hours ago 0 replies      
What? I had no clue this was coming. I'm surprised there weren't more signs.
cobrabyte 8 hours ago 0 replies      
Crap... really liked this service.
h2014 6 hours ago 0 replies      
I also wonder what their exit strategy was supposed to be?
kzhaouva 7 hours ago 0 replies      
best of luck to the founders on their next venture
blhack 6 hours ago 0 replies      
This really bums me out :( -- My girlfriend and I have been making use of homejoy a lot lately, and it has really been helping us keep up with the house.
samstave 4 hours ago 0 replies      
I really REALLY wanted to use Homejoy -- but only for the folding/ironing of laundry. (I have three kids - so a house of five produces a lot of laundry)

$25/hour with a several hour minimum... Wow - no thanks...

classicsnoot 5 hours ago 0 replies      
1. Those who claim cleaning is not a "skilled job" should get off their keyboard, spend a day cleaning their moms house, and get back to me when they learn they are 3X slower than a pro and destroyed something with bleach-based spray.

Finally someone said it. Some moron posted ITT that 'no one is passionate about cleaning someone else's shit', which I find ironic, as this website is populated by many people who clean up other people's shit online. It never ceases to amaze me how easy it is to think the only job with subtle nuances is your own. Thus far in my life, I have done construction, plumbing, lawn care, film, 'fixing', security, and professional driving (not chauffeuring), and I always thought it cute how the pros in each of those vocations could go on for hours about the skill and attention to detail required to be proficient, then in the same breath speak of another vocation like it is simple. I think it is safe to assume that if someone is willing to pay for a service, it probably requires some level of skill.

Full disclosure: in my "unskilled", workaday life I am constantly explaining the immense amount of time, effort, and skill required to make internet fix.

notNow 7 hours ago 0 replies      
"We're determined to support you to keep your homes humming and business buzzing, so we will do our best to ensure partners and clients who want to continue to work together get a chance to do so independently of Homejoy."

Is this why they had to terminate their operations: because clients and partners managed to cut them out of the loop and do business independently of their platform?

smt88 8 hours ago 2 replies      
No surprise there. Tone-deaf marketing, and there was a horror story a few years ago about the culture.
ianlevesque 8 hours ago 1 reply      
I'm just so glad they didn't say it was an incredible journey.
jwise0 8 hours ago 0 replies      
It looks like they have taken the Steve Miller Band approach [1] to users who are also creditors: http://blog.homejoy.com/faqs/ . Hope you didn't have a gift card!

[1] i.e., hoo hoo hoo, go on, take the money and run

icpmacdo 8 hours ago 1 reply      
This is a pretty big company going under, right? A billion-plus valuation?
aestetix 8 hours ago 3 replies      
If your company cannot afford to stay in the black with real employees (as opposed to contractors), you need to re-evaluate your business model.

Edit: and the downvotes are rolling in! Would anyone who has downvoted this comment care to share why?

1.5 TB of Dark Net Market scrapes gwern.net
356 points by gwern  2 days ago   73 comments top 10
rcpt 2 days ago 1 reply      
I've had a very slow-moving hobby project to parse and analyze a subset of this data: https://github.com/rcompton/black-market-recommender-systems

So far I've had some ok results along the lines of "91.7% of vendors who sold speed and MDMA also sold ecstasy" http://ryancompton.net/2015/03/24/darknet-market-basket-anal... I am working on extending this to markets besides evolution now.
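The "91.7% of vendors who sold speed and MDMA also sold ecstasy" figure is an association-rule confidence. A minimal sketch of that computation, using made-up toy baskets rather than the actual scrape data:

```python
# Toy vendor -> product listings; the real analysis ran over market scrapes.
vendors = {
    "v1": {"speed", "mdma", "ecstasy"},
    "v2": {"speed", "mdma", "ecstasy"},
    "v3": {"speed", "mdma"},
    "v4": {"cannabis"},
}

def confidence(antecedent, consequent, baskets):
    """Association-rule confidence: P(consequent | antecedent) over baskets,
    i.e. the share of baskets containing the antecedent that also contain
    the consequent."""
    have_ante = [b for b in baskets if antecedent <= b]  # <= is subset test
    if not have_ante:
        return 0.0
    return sum(consequent <= b for b in have_ante) / len(have_ante)

c = confidence({"speed", "mdma"}, {"ecstasy"}, vendors.values())
print(round(c, 3))  # 0.667 -- 2 of the 3 speed+MDMA vendors also list ecstasy
```

On real data you would typically also report support (how many baskets contain the antecedent at all), since confidence alone is noisy for rare item sets.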

Moshe_Silnorin 2 days ago 1 reply      
Gwern has a Patreon page now for anyone interested in supporting his research: https://www.patreon.com/gwern?ty=h
b6 2 days ago 1 reply      
gwern, you are an absolute force of nature when it comes to generating and collecting and presenting information in a useful way. Thank you.
curiousg 2 days ago 3 replies      
What amazing work! I am very interested in doing research with Tor and a dataset like this could make my job a heck of a lot easier. I have a legal question though: Are your scrapes text only? Before I work with this dataset, I want to make sure that there's no possibility it contains illegal images (child porn).
branchless 2 days ago 2 replies      
This is why I love the internet. This article has given me a fascinating glimpse into a world I have no idea about.

Author: thank you so much for taking the time to document this.

joshmn 2 days ago 1 reply      
This is so cool. Thanks gwern.

If someone's feeling bored, you're welcome to put the entire archive on a web server for us to look at......

Or maybe I'll just do it.

stevewepay 1 day ago 3 replies      
What is the legality with respect to downloading this file? Could it contain material that would put us at legal risk?
ryanlol 2 days ago 2 replies      
>collating and creating these scrapes has absorbed an enormous amount of my time & energy due to the need to solve CAPTCHAs,...

Have you considered automating this?

SergeyHack 1 day ago 0 replies      
The "HOW TO CRAWL MARKETS" section has good tips for general crawling as well.
acosmism 2 days ago 0 replies      
You could also use a library to handle captchas. Have a look at 'tesseract'.
Keybase raises $10.8M keybase.io
338 points by doppp  2 days ago   120 comments top 37
petesoder 2 days ago 2 replies      
"We have a new goal: to bring public key crypto to everyone in the world, even people who don't understand it."

Love the vision, guys. Huge congrats! Looking forward to seeing this one unfold.

irq-1 2 days ago 9 replies      
Keybase is the wrong way to do a PKI directory.

First, people should have multiple keys/identities by default; multiple identities should be the normal thing everyone does. Single identities will be used by governments to control people. They'll also work against normal communication patterns where people speak differently to different groups (think parents, friends, coworkers.)

Second, matching a name with social media is the wrong way to lookup others. It only works if people put enough personally identifiable information (PII) on their social media accounts, otherwise we won't be sure we found the right person. This matching system encourages PII, works against multiple identities, and associates multiple accounts on disparate systems. Everything you told Facebook, you've now told Reddit and Github, etc.. because the companies can also lookup your (single) Keybase identity and connect the accounts, as can advertisers and governments.

Third, leaving people to manage their own private keys is worse for security than having them managed by others. Software changes, glitches and upgrades should be overseen server-side and managed; there should be a help-desk to contact and backups should actually be done, etc... Multiple identities would make it easy to let an employer manage an 'employee' identity, and Facebook manage -or consume- a public 'personal' identity, and Twitter manage a private 'political' identity. (A PKI version of OAuth and 'Sign in with Facebook'; strong crypto can't fix the UX problem.)

I think the winning solution will be: a distributed, server-side PKI system that individuals and companies can host themselves; a DNS-like distributed registration and searching system; a host-provided web UX for managing accounts. Moving private keys to the client will work better if the local key is signed by a hosted key, rather than leaving all the private keys in the hands of end-users.

bascule 2 days ago 1 reply      
Hopefully they'll be able to address this then:


(from https://twofactorauth.org/)

chinathrow 2 days ago 1 reply      
I just looked at the source of the page:

A) external CSS which might leak info about a visitor to Google

B) <!--~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 No Google Analytics or other 3rd party hosted script tags on Keybase. And this has the added bonus that we'll never be able to serve ad code. \o/ \o/ chris max ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~-->
No 3rd party JS: good.

Still leaking visitor info: bad.

pjlegato 2 days ago 3 replies      
The fatal problem with this plan is that all end user devices are by now hopelessly compromised. Take a look at the Snowden documents, which are now years behind the state of the art.

It doesn't matter if your crypto is both bulletproof and easy to use. They'll just break into your phone or your laptop through a side channel and read the key. And you won't even notice.

fredley 2 days ago 2 replies      
Congrats! Increasingly pessimistic as a British subject that I'll ever be able to use something like this without attracting the attention of my own government though.
hippich 2 days ago 3 replies      
From my limited following of keybase.io, it seems like this is mostly a hobby project. How does one get investment for this kind of project? Do you need to know someone already? Or do you decide to take the plunge, create a business plan, and present it at one of these demo days organized for investors?

I feel like there is a huge gap between a potent-but-still-hobby project and Series A funding.

chrisfosterelli 2 days ago 1 reply      
> The Node reference client to our PGP directory will be retired. (PGP support will continue, of course.)

Can anyone from Keybase shed light on the reasoning behind this? It sounds like you are moving away from programmer-friendly API libraries and into GUI apps. Even if they're open source, as a developer I really liked your up-front API. Will you still be providing those APIs?

ejcx 2 days ago 0 replies      
Good for them. I've been watching them quietly working hard, and it has been like that for a while now. Good luck! Hopefully some projects that make security completely invisible come about from this.
lisper 2 days ago 0 replies      
An open-source, audited, encryption application based on NaCl already exists:


I also have a keyserver that is currently in closed beta. Contact me if you want an invite.

magicmu 2 days ago 0 replies      
This is an amazing pursuit, and I wish you guys the absolute best! Unfortunately, I think public adoption of encryption will continue to be a problem until there is absolutely no barrier -- both in knowledge and UI design -- between the average user and encryption. This is a huge step in the right direction though!
zmanian 2 days ago 0 replies      
One request Keybase friends...

Can we get a forum or mailing list to discuss crypto design decisions, security etc on launch day?

baby 2 days ago 1 reply      
It's pretty awesome how simple it is to use.

But why trust a third party to hold your private key?

Why trust them to not store the plaintext when you encrypt/decrypt with them?

jamra 2 days ago 0 replies      
I once thought of doing the same for medical information. I am really excited and eager to see you succeed. It's a shame that America does not offer some kind of national repository, but at the same time, it's great that you can tie a user's own data to his consent. It all works great for non emergency situations. Good job getting the money. I hope you can get the work done too.
ok_craig 2 days ago 0 replies      
Congrats! Keybase has been making cryptography easier for me and everyone I know.
foote 2 days ago 0 replies      
The Terms and Conditions for Keybase.io seem lopsided (https://keybase.io/docs/terms). Why is that?
davesque 2 days ago 0 replies      
Very timely, considering the possible resurgence of the crypto wars.
snissn 2 days ago 1 reply      
Do they have any sort of monetization plan or strategy?
realusername 2 days ago 1 reply      
I have a small question here for the people who use Keybase; it's been difficult to find answers for this. How do you encrypt per email address? Is that even possible? I would like to use it for a personal project of mine, and that would be the use case. (Sorry to hijack this thread; it might be interesting for other people.)

Congratulations to the whole team on your hard work!

_pius 2 days ago 0 replies      
Happy to hear this, love what they're doing.
ahuja_s 1 day ago 0 replies      
This could help solve the personal identity verification issues associated with smart contracts and such. This is one of the key problems with the Ethereum project too.
whodges 2 days ago 0 replies      
Wondering just how secure the @keybase.io email part of this is going to be, and how it will work. Would you have to load this email into a client, or would you just pull down the email in Node?

Thoughts anyone?

zekevermillion 2 days ago 4 replies      
Awesome! I hope some of this $ will help to shorten the waiting list queue...
lelininkas 1 day ago 0 replies      
"Your mom is on LinkedIn and Facebook, downloading toolbars." I laughed too hard at this. To be honest I still am.
xanderjanz 2 days ago 0 replies      
Doesn't this just set up a central failure point? Does Keybase have a forward secure key system?
Redoubts 2 days ago 0 replies      
https://keybase.io/jobs is timing out (504).


jtwebman 2 days ago 0 replies      
Congrats guys! I had played around with this 6 months ago. Looking forward to seeing what you guys build.
stuartaxelowen 2 days ago 0 replies      
It feels like their growth strategy is "make security decisions for other people".
reviseddamage 2 days ago 0 replies      
>server-side PKI system that individuals and companies can host themselves

It is not possible to deploy this to the mainstream, who probably need encryption the most. Keybase's mission, iirc, is to bring encryption to the masses without their having to learn about encryption. It is imo the best alternative to the status quo, i.e. no encryption.

keybase-please 2 days ago 1 reply      
Can someone please send a Keybase invitation to:


This help would be appreciated.

chinathrow 2 days ago 2 replies      
Why the invite thing? Really? Why artificially limit who you let in?
higherpurpose 2 days ago 0 replies      
I wish we didn't push outdated crypto protocols such as PGP. So many companies seem to want to invest money or development time into making PGP "better" - with one exception: they can't make it forward secure, which I think is absolutely critical in today's world of companies and individuals "losing everything" in a hack.

Investing in a protocol that will last another 20 years and doesn't even have forward secrecy seems like such a wasted opportunity. It feels like one of those things where 20 years later you think "I wish we hadn't done that then".

OrangeTux 1 day ago 1 reply      
I still have some invites left; who wants one?
tunafishman 2 days ago 0 replies      
Really well written. Kudos to whoever wrote your copy!
taf2 2 days ago 0 replies      
Grats @gabrielh!
rnhmjoj 2 days ago 0 replies      
Hacker news managed to kill even keybase.io.
jgalt212 2 days ago 0 replies      
> Chris Dixon championed the deal; he's known for his visionary investments, ranging from Soylent to Oculus to Airware.

Pretty strong language there (and definitely not provably true or false), but I guess bold phrasing is what SV is known for.

Donate to Chelsea Manning's Legal Defense freedom.press
294 points by hendi_  1 day ago   164 comments top 15
andrelaszlo 1 day ago 7 replies      
I find it very disturbing that you need tens of thousands of dollars at your disposal to get a "fair" trial.

I say "fair", because the output obviously changes with the amount of money you feed the system. Too little money and the outcome will be biased against you, too much money and the outcome will be biased in the other direction. Which amount will give an unbiased output? $50k? $100k?

How can this be considered normal?

"All are equal before the law and are entitled without any discrimination to equal protection of the law." - Universal Declaration of Human Rights 7

"If we look to the laws, they afford equal justice to all in their private differences; if no social standing, advancement in public life falls to reputation for capacity, class considerations not being allowed to interfere with merit; nor again does poverty bar the way" - Pericles (431 BC)

Simulacra 1 day ago 0 replies      
To give another parable: According to Wikipedia, the Weatherman later known as "..the Weather Underground Organization, was an American militant organization that carried out a series of bombings, jailbreaks, and riots from 1969 through the 1970s." They committed acts of violence in the name of doing what was right for America. To save America from the evils of the government. Should we forgive and forget their actions, simply because it was in the interest of making America better? To righting what they perceived was unjust policy?

I contend that's a very slippery slope, because where do we draw the line and say: if you break only these laws, it's ok so long as you're "doing it for America", but if you break these other laws, that's going too far? America should not be an excuse for unlawful behavior.

Laws exist, morally right or wrong, so that they can hopefully be applied equally to everyone. That never happens, but we cannot just turn a blind eye to some crimes versus others because of morality, or "doing what is right" for America. Ms. Manning broke a law willingly, knowingly, with full appreciation for her actions. She has admitted doing so in response to American policy towards gay marriage. She's guilty, and she should be punished.

fruzz 1 day ago 5 replies      
I'm noticing in these posts that those who view Chelsea Manning's actions favourably use the right name and pronouns.

Those who do not view her actions favourably use the wrong name and pronouns.

That suggests that there's overlap between those who do not view her favourably and people who do not view transgender people favourably.

Simulacra 1 day ago 5 replies      
I cannot and will not donate to any defense fund for someone accused of leaking classified material. Regardless of the reasons, moral or otherwise, they committed the crime. Now it's time to do the time. This should not even be a "Yes, but.." situation. Manning is guilty and deserves the punishment.
fche 1 day ago 1 reply      
(Curious what the basis of the appeal is to be.)
toni 1 day ago 4 replies      
So in other words, there is not a single law firm in the US that will take on Chelsea Manning's crucial cause without an eye on the money? That's why she's resorting to asking for donations? Is that even possible?
im3w1l 1 day ago 0 replies      
I have started to forget parts of this story. Does anyone know of a good refresher text?
steve19 1 day ago 0 replies      
Justice is indeed blind. Manning had her day in court. Nobody disputes what she did, or that it was illegal. She pled guilty and was convicted.

Now, arguing that this is not fair is something else altogether.

srose3100 1 day ago 0 replies      
Sadly the odds will be stacked against her but I think it's worth donating if you can.
transfire 1 day ago 3 replies      
So I have a question. How does a human being defend themselves from being turned into a woman, if their captors drug them and torture them in such a way that it makes them want to become a woman?

I find it very hard to believe that the U.S. Government will pay for sex change meds and procedures for a convicted traitor, when I can't even get them to buy me an aspirin.

anon3_ 1 day ago 1 reply      
A legal defense? He admitted to breaking the law.

He is a traitor who has endangered lives.

If you want to talk about people's rights - what about the rights of the people whose identities he leaked? Where was their trial? I don't see much sympathy for them.

justwannasing 1 day ago 1 reply      
Sorry. I've already contributed to the funerals of those harmed by what Manning leaked.
megaman22 1 day ago 0 replies      
Perhaps some of the money spent on transitioning her would have been more profitably spent on legal defense?
wtbob 1 day ago 5 replies      
It's remarkable that this has so many points. Why would so many be keen to contribute to the defense of someone who violated his oath and betrayed his country? He wasn't a whistleblower: there are processes in place for whistleblowing, and there are inspectors general who would love to get a good case of misconduct to work on, and he never availed himself of them.

He pled guilty at trial, and received a fair sentence.

chatman 1 day ago 1 reply      
Very difficult to judge from the donation page how likely success is in this case. I would happily donate if I saw there was a good chance of a win.
Vim's 400 line function to wait for keyboard input greer.fm
296 points by shubhamjain  2 days ago   228 comments top 40
joosters 2 days ago 16 replies      
But the code works. It supports lots of platforms through choice. Yes, you could make the code prettier if you dropped some platforms. Yes, you could refactor it to use some abstracting libraries that now exist. But the code works. If you rewrote the code, the best result you could end up with is the same functionality that still works. All other possible results are bad. There is nothing to be gained.
jcl 2 days ago 1 reply      
This reminds me of a talk at last year's CppCon about modernizing legacy code. In it, they talked about a core output function of the MSVC standard library that had accreted enough #ifdefs over the years that the function signature alone spanned over 100 lines. It's a great example of how repeatedly making the minimal necessary change can drive code into an unmaintainable mess.

When the team found that the function was impossible to modify to accommodate C99 format string changes, they undertook a lengthy project to factor out the #ifdef'd features using abstractions available in C++. Not only were they able to turn the code into something much easier to modify, but they also fixed multiple hidden performance bottlenecks, so the end result actually ran 5x faster than the C version.


kosma 2 days ago 2 replies      
There's a deeper underlying problem: this bit of code lacks any architecture. Even though libuv and others didn't exist back then, that's no excuse to just pull random descriptors from various parts of the program and stuff them into a select, repeating variants of the same logic over and over. Even creating a consistent notification/callback interface would make the code much more readable, as the underlying pattern is always the same: watch a descriptor and call a function when an event occurs. It's just lazy and ugly.
breadbox 2 days ago 3 replies      
There is nothing wrong with supporting lots of platforms, but those #ifdefs need to be encapsulated in wrapper functions (or macros). This is a classic example of premature optimization, actually. You use one or two #ifdefs directly because you hate to pay the cost of function-call overhead just to make the code easier to read. (Even though there's practically no point in tiny optimizations just before the code is going to wait for keyboard input.) A few years go by, a few more situations are done via #ifdef because at least that way it's consistent. Eventually you have a nightmare function like this, where reading it forces you to read every possible version of the function simultaneously.

Encapsulate your #ifdefs people!
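A sketch of the encapsulation being argued for, with made-up feature macros standing in for Vim's real ones: each platform question gets one small wrapper, and the conditionals never leak into the main wait loop.

```c
/* Hypothetical wrappers: the #ifdefs live here and nowhere else.
   FEAT_XCLIPBOARD / USE_XSMP are illustrative feature macros. */
int extra_fd_count(void)
{
    int n = 0;
#ifdef FEAT_XCLIPBOARD
    n++;                      /* X server connection */
#endif
#ifdef USE_XSMP
    n++;                      /* session-manager socket */
#endif
    return n;
}

/* The caller's loop now reads as plain C: no preprocessor,
   no simultaneous mental builds of every configuration. */
int want_short_timeout(long msec)
{
    return msec > 0 && extra_fd_count() > 0;
}
```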

AdmiralAsshat 2 days ago 3 replies      
To my knowledge, maintaining compatibility is one of the stated goals of Vim, even at the expense of performance. NeoVim is supposed to strip out some of these antiquated features.
jandrese 2 days ago 2 replies      
This is a classic case of why legacy code is a nightmare to maintain. It's the OpenSSL situation all over again, and one can definitely see the appeal of going through there and ripping out all of the functionality that nobody uses anymore just to make maintenance less of a nightmare. Luckily vim doesn't run suid root on any sane system, so all of this legacy cruft is not as huge of a threat surface on the machine.
Camillo 2 days ago 4 replies      
I'm a bit surprised that they have code like this:

  if (msec > 0 && (
  # ifdef FEAT_XCLIPBOARD
      xterm_Shell != (Widget)0
  #  if defined(USE_XSMP) || defined(FEAT_MZSCHEME)
      ||
  #  endif
  # endif
  # ifdef USE_XSMP
      xsmp_icefd != -1
  #  ifdef FEAT_MZSCHEME
      ||
  #  endif
  # endif
  # ifdef FEAT_MZSCHEME
      (mzthreads_allowed() && p_mzq > 0)
  # endif
      ))
When they could have written:

  if (msec > 0 && (
  # ifdef FEAT_XCLIPBOARD
      xterm_Shell != (Widget)0 ||
  # endif
  # ifdef USE_XSMP
      xsmp_icefd != -1 ||
  # endif
  # ifdef FEAT_MZSCHEME
      (mzthreads_allowed() && p_mzq > 0) ||
  # endif
      0))
Perhaps they were targeting compilers so primitive that they could not optimize out a "|| 0".

(edited to hide my shame)

riquito 2 days ago 1 reply      
I've just realized that Vim has never exited with an error in my whole life, while pretty much all the other editors I've ever used have. While its code is old, it's certainly resilient.
erikb 2 days ago 1 reply      
To answer most questions here: Yes, nowadays there are cross-platform libraries you can use instead of implementing that yourself. And yes code that has grown for centuries and contains unnecessary things or patterns that aren't used nowadays can be refactored. Or at least that's how I'd interpret the article.
saosebastiao 2 days ago 0 replies      
I would much prefer the coeffect system for managing different platforms and capabilities:


It would be so amazing to be able to take all of those ifdefs and turn them into type variants and have the compiler enforce the safety and semantics of the system. And it would be phenomenal for old code to break at compile time when the environment changes, instead of waiting for a bug report to roll in.

So, paging T. Petricek...when are we gonna get it in F#? :) Even better if someone can get it into Rust! That is the perfect type of feature for a systems programming language.

mkagenius 2 days ago 0 replies      
Can't believe people are going mad over a code snippet when they don't know under what conditions it was written and under what conditions the code base is being maintained.
thebelal 2 days ago 0 replies      
For those interested in viewing this code in context, it is at (https://github.com/vim/vim/blob/db9bc0b7931d252cf578c1cd298a...)

Also note that RealWaitForChar has been completely removed from neovim since April of 2014 (https://github.com/neovim/neovim/pull/474/files)

gjkood 2 days ago 1 reply      
Does this have a noticeable performance impact for a typical vim end user, using it in an interactive file editing workflow? On any modern hardware?

I suppose there may be some negative performance impact if we need to use it for an automated/batch workflow (I can't think of many where something like 'sed' won't be better suited).

renox 11 hours ago 0 replies      
> Vim tries to be compatible with every OS, including dead ones such as BeOS, VMS, and Amiga.

There are people working on a BeOS revival, Haiku OS; they would (rightfully) be quite annoyed if vim dropped support for the OS they're trying to resurrect.

Spakman 1 day ago 0 replies      
I suppose it's because I've spent quite a lot of time in the GNU Readline source that this function doesn't seem that bad. Not that I'm wanting to endorse it as great code, of course.

I find it fascinating to look at how old but very well used software develops (often, but not always) in ways that seem completely ghastly.

pkaye 2 days ago 2 replies      
The question is do you think it could have been done better given that they support multiple OS/UI, etc...
rootbear 2 days ago 2 replies      
I assume portability is why the function declaration is old K&R style and not ANSI prototype style. I'm always surprised when I see K&R style in modern(ish) code. I do miss the ability to declare multiple parameters of the same type without repeating the type name, though. Oddly, several modern languages (like D) seem to think that's a feature.
lnanek2 1 day ago 0 replies      
It's ugly, but vim is one of the very few programs I've always seen ready and available on any machine I log into. Even the login shell used/available changes, but vim is always there. So they are doing something right.
jhallenworld 2 days ago 0 replies      
JOE never tries to use poll or select (it uses multiple sub-processes writing to the same pipe to implement input multiplexing). This archaic way works on all versions of UNIX, even very old ones which don't have select() or poll(). No #ifdefs needed :-)

Output processing in JOE is interesting: when writing to the terminal, JOE periodically sleeps for a time based on the baud rate and amount of data. This is to prevent too much output buffering so that at low baud rates you can interrupt the screen refresh with type-ahead. If you don't do this at 1200 baud it's a big issue (you could wait for hours if you hit PageDown too many times).
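The pacing math behind that is simple: at B baud, a byte is roughly 10 bit-times on the wire (8 data bits plus start/stop), so n bytes occupy about n * 10 / B seconds. A hypothetical helper (not JOE's code):

```c
/* Microseconds to sleep after writing nbytes at the given baud
   rate, so the OS output buffer stays shallow and type-ahead can
   still interrupt a screen refresh. */
long pacing_usec(long nbytes, long baud)
{
    return nbytes * 10L * 1000000L / baud;
}
```

For example, 120 bytes at 1200 baud is a full second of wire time, which is why unthrottled PageDown spam could queue up hours of output.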

kelvin0 2 days ago 0 replies      
Emacs also does this with a VM made from a DSL written in Lisp. Of course, I never use either Vim or Emacs ... I am mostly in VS land 99% of my time. I can't even try to imagine what THAT code must look like.
kyberias 2 days ago 0 replies      
I don't see any Amiga specific code here.
jerf 2 days ago 0 replies      
It would be interesting to run this through the preprocessor and see what it comes out as on a real Linux system or something. I can't really tell through all the #ifs, but it would certainly be much shorter.
dale-cooper 2 days ago 0 replies      
Another "interesting" project to look at is nethack.
jheriko 1 day ago 0 replies      
There is no excuse for this. It's trivially refactorable into something easier to maintain, with fewer ifdefs, if you are willing to include/exclude files by platform (and if not, just wrap whole file contents in single ifdefs).

This is a monster that has grown over time without any proper love.

hackuser 1 day ago 0 replies      
I wonder how many examples there are of a more beloved UI (among coders) than Vim's keyboard input. For myself, its exceptional keyboard responsiveness is an essential part of what makes Vim feel so effortless, so hardwired to my brain.

I wish all code was equally buggy.

reilly3000 1 day ago 0 replies      
What are some examples of open-source project that are in daily use by millions of folks that don't have some spaghetti code? I am sure there are some out there, but I really can't think of any at the moment.
noobermin 2 days ago 0 replies      
To be very frank, I've seen much worse. "Functions" more than 400 lines long, perhaps even 1000 or more, riddled with more macros like in this sample than you can shake a stick at. I'd share a sample of this ghastly stuff, but it's closed source, alas.
mrbill 2 days ago 0 replies      
Speaking of legacy platforms, anyone know of a vi/tinyvi/etc work-alike for CP/M?

I'm going to be doing some hacking on z80 assembly via Yaze, and it would be great if I could have the editor I'm used to, natively on the platform (rather than having to edit in OSX or whatever).

fibo 1 day ago 0 replies      
If I use vi, vim or neovim as a user I will not look at the code first; I will look at availability. If I am working on a RedHat server for a customer of mine, I will use vi because it is already installed.
MBlume 2 days ago 0 replies      
I'd like to see what the code looks like after it's been through the preprocessor in a modern Linux environment -- I'm guessing a hell of a lot shorter?
andrewchambers 2 days ago 0 replies      
ifdefs inside functions create monstrosities, it is far better to have few preprocessor conditionals, and instead have platform specific c files.
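The file-per-platform layout suggested here might look like the following sketch (illustrative names, not Vim's real tree): one shared header, and each build compiles exactly one implementation file, so any given file has at most one conditional.

```c
/* input.h -- the single interface every platform implements:
   wait up to msec for input (negative msec = wait forever). */
int platform_wait_for_char(long msec);

#ifndef _WIN32
/* input_unix.c -- in a real tree this whole file would be
   compiled only for Unix builds; the #ifndef here just keeps
   this sketch a single listing. */
#include <sys/select.h>
#include <unistd.h>

int platform_wait_for_char(long msec)
{
    fd_set rfds;
    struct timeval tv = { msec / 1000, (msec % 1000) * 1000 };

    FD_ZERO(&rfds);
    FD_SET(STDIN_FILENO, &rfds);
    /* One conditional per file, not dozens per function. */
    return select(STDIN_FILENO + 1, &rfds, NULL, NULL,
                  msec < 0 ? NULL : &tv);
}
#else
/* input_win32.c -- the Windows build compiles this file instead;
   a real version would use WaitForSingleObject on the console
   handle. Body elided in this sketch. */
int platform_wait_for_char(long msec) { (void)msec; return 0; }
#endif
```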
throwaway3648 2 days ago 0 replies      
Just getting window resizing across curses implementations took me 200 lines of C
aceperry 2 days ago 0 replies      
First glance at the headline had me wondering, "Is this a good thing?"
vmorgulis 2 days ago 0 replies      
ncurses is similar, with a lot of legacy code to support old terminals.
daxfohl 2 days ago 0 replies      
So when will 16K be a standard resolution?
amadeus 2 days ago 0 replies      
Yet another Vim bashing post by the Floobits bros.

It's getting old.

PSeitz 2 days ago 0 replies      
Do one thing and do it well
cat_dev_null 2 days ago 2 replies      
If you are all such experts, why don't you "fix it" and then pull request it in.
rando3826 2 days ago 2 replies      
Emacs C code is far better! And it's only ~22% c code instead of vim's ~66%


manis404 2 days ago 1 reply      
Really, why does Vim still exist in the first place?
The New York Times uses WebRTC to gather local IP addresses webrtchacks.com
243 points by DamienSF  2 days ago   119 comments top 22
dzlobin 2 days ago 2 replies      
Forum post from Dan Kaminsky, co-founder of WhiteOps[1][2]:

"Dan Kaminsky here, my apologies for kicking up a ruckus. This is part of a bot detection framework I've built at White Ops; we basically are able to detect browser automation using resources exposed in JavaScript. Nothing dangerous to users -- or we'd go file bugs on it, which we do from time to time -- but it does provide useful data regarding post-exploitation behavior. Happy to jump on a call with anyone concerned or worried; I'm over at dan@whiteops.com."

[1] http://www.whiteops.com/company[2] https://isc.sans.edu/forums/STUN+traffic/745/2

AdmiralAsshat 2 days ago 4 replies      
Just a friendly reminder for anyone using uBlock Origin on Chrome or Firefox that you can now configure it to prevent webRTC from leaking your real IP:


You do need to enable this. After reading the article I immediately checked my dashboard and saw that the option was available, but unchecked.

Wilya 2 days ago 1 reply      
A whois on the domain serving the offending javascript leads to White Ops[0], who seems to sell tools to protect against Ad Fraud. So I'm guessing this is part of their fingerprinting system, to determine whether I am a human or a bot.

[0] http://www.whiteops.com/

userbinator 2 days ago 2 replies      
I believe that WebRTC, just like JavaScript, should be disabled by default and enabled only on sites that you really trust and need it; and in the case of WebRTC, the argument is much stronger since its use-case is so specific.
decasteve 2 days ago 1 reply      
Ironic that loading up this site, webrtchacks.com, Tor Browser warns me: "Should Tor browser allow this website to extract HTML5 canvas image data?"

I've now given up on "naked" browsing of the web and only surf via the Tor Browser Bundle. I use a standard Firefox only for web development.

_joev 2 days ago 0 replies      
Here's a tool I wrote that grabs your internal IP and scans your LAN using response timings and HTTP asset fingerprints:

Demo: http://joevennix.com/lan-js/examples/dashboard.html
Code: https://github.com/joevennix/lan-js

If you are interested and have some time, find and contribute HTTP "fingerprint" assets from devices on your LAN to src/db.js.

joshmn 2 days ago 1 reply      
I said this when the vulnerability/bug/whatever you want to call it was posted here: I use the same method for fraud detection, and it works unreasonably well.

That said, I'd rather there be permissions surrounding WebRTC, but my clients are happy.

joosters 2 days ago 1 reply      
Can they grab local IPv6 addresses using this? While a huge number of computers are going to be on, their IPv6 address could actually be unique, making user fingerprinting easier.
proactivesvcs 2 days ago 1 reply      
I recently added tagsrvcs.com to my Privoxy blocklist. Source site? ycombinator.com.
mastre_ 2 days ago 0 replies      
On OS X, Little Snitch catches this in Chrome, as it would in any browser https://i.imgur.com/hWmpc42.png
ised 1 day ago 2 replies      
The www world really needs more www "browsers", particularly some more that do not implement JavaScript. Would it hurt to give users more choice and see what they choose?

Only my opinion but there is much one can do without all the .js

I certainly do not need Javascript to fetch some newspaper articles via HTTP.

jmount 2 days ago 3 replies      
WebRTC, a protocol proposed by Google to W3C has applications in user tracking and detection of bots. Cui bono.
phragg 2 days ago 2 replies      
So wasn't everyone up in arms about WHOIS recently? Yet here it's seemingly being used to identify who wrote this script.
btown 2 days ago 2 replies      
The only possible reason I can fathom that this would be useful would be for tracking unique users behind a NAT (i.e. corporate or educational) who block all cookies. Seems like a pretty niche edge case in the U.S., but I'd imagine this could be useful in, say, the EU where cookies are opt-in by law?
donohoe 1 day ago 0 replies      
To be clear, it's not a developer at the NYTimes that has implemented this.

It looks like the script in question is hosted on a domain ("tagsrvcs.com") that Adobe uses when loading JS assets for Omniture.

This is very likely a standard Adobe Omniture thing. So it's not the NYT acting alone (or necessarily with awareness of this).

itistoday2 2 days ago 2 replies      
Why are they doing this?
beedogs 2 days ago 0 replies      
Reasons to block javascript, #12395 in a series.
api 2 days ago 3 replies      
It's easy to gather local IP addresses. WebRTC is just one of dozens of methods of doing this. Others include various DNS tricks, reverse TCP traceroute, <img> tag tricks, JavaScript/XMLHttpRequest tricks, etc. Private IP addresses (10.x.x.x) are not all that private.
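The "not all that private" point can be sketched server-side with nothing but the Python standard library (an illustration of the general idea, not one of the browser tricks listed above): a UDP connect() sends no packets, yet it makes the OS pick an outbound interface, revealing the machine's private address.

```python
import socket

def local_ip():
    """Return this machine's private IP via the UDP connect trick."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # 192.0.2.1 is a TEST-NET address; connect() on a UDP socket sends
        # nothing, it only asks the OS which local interface would be used.
        s.connect(("192.0.2.1", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no route available (e.g. offline)
    finally:
        s.close()

print(local_ip())
```

If no route exists (e.g. an offline machine), the sketch falls back to loopback rather than failing.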
x0x0 2 days ago 2 replies      
there's really no way in chrome to disable webrtc? That's amazing.

edit: from the horse's mouths https://code.google.com/p/chromium/issues/detail?id=457492

edit2: you can install this


and test here:


though google sure seems to be dragging their feet on this so I'm sure they'll break this workaround soon

jgalt212 2 days ago 2 replies      
Here's another White Hat use case for local IP addresses.

You can use it to unobtrusively monitor license compliance for a SaaS biz. You charge per user. Suppose a user is constantly logging on from multiple browsers during the day (e.g. IE and Chrome). With local IP knowledge you can determine whether this is being done from the same machine (still abiding by the license terms) or from multiple machines (most likely sharing with a colleague and breaking the license terms).

Before this WebRTC hack, the only other way to do this that I am aware of is via the dreaded Flash cookie.
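A minimal sketch of that compliance check, with a made-up session log (the users, browsers, and local IPs below are all hypothetical):

```python
from collections import defaultdict

sessions = [  # (user, browser, reported local IP) -- sample data
    ("alice", "IE", "10.0.0.5"),
    ("alice", "Chrome", "10.0.0.5"),  # same machine, two browsers: fine
    ("bob", "Chrome", "10.0.0.7"),
    ("bob", "Chrome", "10.0.0.9"),    # two machines: possible seat sharing
]

# Group each user's sessions by the local IP they reported.
machines = defaultdict(set)
for user, _browser, ip in sessions:
    machines[user].add(ip)

# More than one local IP for a single account suggests multiple machines.
flagged = sorted(u for u, ips in machines.items() if len(ips) > 1)
print(flagged)  # -> ['bob']
```

In practice you would also want a time window, since DHCP can hand the same machine different addresses over time.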

dsjoerg 2 days ago 0 replies      
i've had a bit to drink, can someone ELI5 this to me?
1ris 2 days ago 1 reply      
In other news: if you create an IP connection, the other party knows your IP address. With WebRTC some parts of this ugly NAT madness are gone.

Nothing to see here.

Teslas Model S Gets Ludicrous Mode, Will Do 0-60 in 2.8 Seconds techcrunch.com
222 points by arturogarrido  7 hours ago   249 comments top 20
stefanobernardi 7 hours ago 10 replies      
Even if it's not a software update, I still find Tesla's way of thinking of a car as an actual (continuously update/upgrade-able) product fascinating. Very rare for hardware.

It gives an amazing user experience, "Hey restart your car and it's now got X and Y". Respect.

Edit: clarifying the term "product".

rjusher 6 hours ago 5 replies      
One thing I don't understand is the position of the big car makers (BMW, MB, Audi) as passive observers in this field.

Tesla is getting so much attention that any automaker that could deliver a car with half the specs of a Model S, while keeping their models priced as mass-produced cars, would deliver a big punch to Tesla and greatly move the market forward.

Is it the investment necessary for building a network of charging stations?

I highly doubt it is because Tesla has more money for R&D than any other car maker.

Does getting a Model S still earn you the title of early adopter? I believe the market has already shifted towards this type of vehicle, but I may be biased, because I already desire an electric car.

csense 6 hours ago 1 reply      
I like the Spaceballs reference. https://www.youtube.com/watch?v=mk7VWcuVOf0
rdl 6 hours ago 2 replies      
If you can do this as a $10k update "safely" with a computer-controlled pyro fuse, I wonder what you could do by hacking the firmware and replacing it with a piece of busbar. 2 seconds?
liamk 7 hours ago 3 replies      
Amazing, that's as fast as the Lamborghini Aventador https://en.wikipedia.org/wiki/Lamborghini_Aventador
vvanders 6 hours ago 1 reply      
More impressive to me is the fact that the 90kWh battery is already trickling down to the Model S. It was originally developed for the Model X.

I think it's safe to say that we'll see Model X features moving down to the Model S much faster than previously thought. This is how you relentlessly improve a product instead of holding specific features to a Model without solid technical reasons.

leeoniya 7 hours ago 8 replies      
so it's a fuse that costs $5k or $10k. is it made from 5oz-10oz of solid platinum? what am i missing here [1]?

[1] http://www.amazon.com/s/ref=nb_sb_noss_1?url=search-alias%3D...

fowlslegs 3 hours ago 1 reply      
Seems like they took inspiration from the branding on the Chromebook Pixel LS (Ludicrous Speed)--the 16GB RAM, i7 one that I'm lucky enough to have ;-)

Fun fact: most cars today run about 20 separate operating systems. My guess is Tesla is fairly above average. Anyone have a figure on this?

Animats 6 hours ago 1 reply      
More usefully, there's an option to increase the range to 300 miles. That's still only half the range of many stock gasoline cars, though.
mhurron 6 hours ago 1 reply      
Does it cause it to go to plaid?
devy 6 hours ago 1 reply      
It's a $10k upgrade, not cheap! I wonder which Ludacris song[1] was the hold music looping before today's Tesla press conference call...

[1] http://www.theverge.com/2015/7/17/8994519/tesla-ludicrous-sp...

burger_moon 5 hours ago 2 replies      
0-60 times are nice, but I want to know what the 60-100 time is. That's what's generally used for 1/4 mile estimates.

Are there any diagrams or information easily available about the coupling style used between the motors and the wheel hubs? I wonder how they're done and what their limit is with the amount of torque the motors put out. In a previous life I used to build a lot of race cars, and shearing axle bolts wasn't uncommon in drag applications.

It would be awesome to see them test out a full on drag tesla.

Jun8 5 hours ago 0 replies      
On a different note: is it possible to opt out of Tesla software updates (for whatever reason)? AFAIK, this is not possible (http://my.teslamotors.com/forum/forums/same-distasteful-appe...) The two options when an update is available seem to be install now or later.
myth_buster 6 hours ago 8 replies      
Tesla seems to be beating a dead horse! Don't get me wrong; I love what Tesla does, but making a faster production car seems to be an answer to critics who I think have already taken home the message that the electric car is as fast as the best IC car!

I think what they should address is getting the affordable Model 3 to market as fast as they can. They may lose the game if the incumbents beat them to it. They would need volume to sustain their production and maintenance costs. Once every manufacturer gets onto the electric wagon, a lack of wider adoption could become their Achilles' heel.

aswanson 2 hours ago 0 replies      
I wonder if it will have a "Drake" mode to do 0-100 real quick.
mrfusion 6 hours ago 3 replies      
How can they get that many amps out of the battery? It seems like you'd need a capacitor to have that kind of power.
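Some back-of-envelope arithmetic (the figures below are assumptions for illustration, not from the article): coverage of the announcement cited raising maximum pack output from roughly 1300 A to roughly 1500 A via the upgraded fuse, and at a plausible nominal pack voltage that is already around half a megawatt drawn straight from the cells, with no capacitor bank involved.

```python
amps = 1500   # assumed max pack current after the fuse upgrade
volts = 360   # assumed nominal pack voltage
kw = amps * volts / 1000
print(kw, "kW")  # -> 540.0 kW
```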
braum 4 hours ago 0 replies      
and still no Model X or even serious updates about it... and yet I will still let them keep my $5k until it arrives...
curiousjorge 6 hours ago 2 replies      
this alone will be a good enough excuse to get this car if you can afford it.

Aside from the Nissan GTR, supercars running around 300,000 USD will do roughly 3.0~3.1. Hypercars like the Pagani, P1, Porsche, LaFerrari achieve sub-3.0.

I'm extremely impressed. Even more so because this was just another regular software update.

jkot 6 hours ago 0 replies      
Where is source code?
ck2 6 hours ago 2 replies      
No-one needs this, I predict accidents.

I hope insurance companies can determine which models have this and inexperienced drivers should pay more.

Just hope no-one kills anyone.

Ask HN: Good Python codebases to read?
254 points by nekopa  1 day ago   149 comments top 36
aaronjgreenberg 1 day ago 4 replies      
Jumping on the Kenneth Reitz train, you might check out The Hitchhiker's Guide to Python: http://docs.python-guide.org/en/latest/

He recommends the following Python projects for reading:

* Howdoi (https://github.com/gleitz/howdoi)

* Flask (https://github.com/mitsuhiko/flask)

* Werkzeug (https://github.com/mitsuhiko/werkzeug)

* Requests (https://github.com/kennethreitz/requests)

* Tablib (https://github.com/kennethreitz/tablib)

Hope that helps---good luck!

jonjacky 1 day ago 2 replies      
Peter Norvig's examples. They are quite short and include much explanation in addition to code. They also include tests and benchmarking code.

http://norvig.com/lispy.html and http://norvig.com/lispy2.html (Lisp interpreter)

http://www.norvig.com/spell-correct.html (Spelling corrector)

http://norvig.com/sudoku.html (Sudoku solver)

Also his online course Design of Computer Programs includes many short, well-explained Python examples:


clinth 1 day ago 2 replies      
Requests - https://github.com/kennethreitz/requests.

How to make a usable API. The decisions that went into each method call were fantastic. Great test coverage as well. I use this package in most Python development.

jscottmiller 1 day ago 0 replies      
Bottle: https://github.com/bottlepy/bottle

It's a nice, small, fast web framework. Great for building APIs. Also, it's one readable file of ~3k LOC.[1]

[1] https://github.com/bottlepy/bottle/blob/master/bottle.py

svieira 1 day ago 0 replies      
Several good ones have already been suggested, but here's a few more:

- https://github.com/mahmoud/boltons : utility functions, but well documented

- https://github.com/KeepSafe/aiohttp : a Python 3 async HTTP server

- https://github.com/telefonicaid/di-py : a dependency injection framework

- https://github.com/moggers87/salmon : a fork of Lamson (which was written by Zed)

Python's internals are pretty darn open, so here's a few suggestions that push the boundaries of meta programming in Python - they're not the idiomatic code you're looking for right now, but later, when you know the best practices and you're wondering what is possible they'll be good to look at:

- https://github.com/Suor/whatever : Scala's magic `_` for Python

- https://github.com/ryanhiebert/typeset : Types as sets for Python

- https://github.com/AndreaCensi/contracts : Gradually typed Python (akin to MyPy)

- http://mypy-lang.org : Gradually typed Python - the future (at least right now)

nyddle 1 day ago 4 replies      
Flask - https://github.com/mitsuhiko/flask. It's small, awesome and digestible.
spang 1 day ago 2 replies      
The Nylas Sync Engine is a large Python codebase with a test suite: https://github.com/nylas/sync-engine

Lots of examples of SQLAlchemy, Flask, gevent, and pytest in action to build a REST API and sync platform for email/calendar/contacts data!

travisfischer 1 day ago 1 reply      
A large Python project that I haven't seen mentioned by others but that I find to be particularly well written and designed is the Pyramid web framework.

* https://github.com/Pylons/pyramid/

shoyer 1 day ago 0 replies      
I recommend PyToolz, a "set of utility functions for iterators, functions, and dictionaries": https://github.com/pytoolz/toolz

The functions in PyToolz are short, well tested and idiomatic Python (though the functional programming paradigm they support is not quite so idiomatic). I recommend starting with the excellent documentation: http://toolz.readthedocs.org/en/latest/

In particular, the API docs have links to the source code for each function: http://toolz.readthedocs.org/en/latest/api.html

jordigh 1 day ago 0 replies      

By design, Mercurial has almost no dependencies, so it's very self-contained. I find this makes it a particularly easy codebase to get into.

If you're interested, I would love to walk you (or anyone else!) through it.

mattwritescode 1 day ago 2 replies      
The Django project is a good example of a large open-source project which has aged well. http://github.com/django/django
feathj 1 day ago 2 replies      
Check out boto. It's Amazon's official library for interacting with AWS. It is written and tested well. I use it every day.


aaronchall 1 day ago 0 replies      
Here's the link to the Pandas DataFrame source: https://github.com/pydata/pandas/blob/master/pandas/core/fra...

We spent a month of Sundays going through this in the NYC Python office hours. You learn a lot about this object by reading the source, and the WTF per minute rate is fairly low.

The style is also fairly non-controversial.

giancarlostoro 1 day ago 0 replies      
I see nobody has recommended CherryPy:http://www.cherrypy.org/

It is a minimal web framework like Sinatra or Flask. The beautiful thing about CherryPy is you write code for it the same way you would write general Python code. I enjoy using it for small projects from time to time.


Bickbucket Repository: https://bitbucket.org/cherrypy/cherrypy/overview

mkolodny 1 day ago 0 replies      
Guido van Rossum, the creator of Python, co-wrote a web crawler in under 500 lines: https://github.com/aosabook/500lines/tree/master/crawler

It's especially interesting because it takes advantage of new Python features like asyncio.
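The core pattern of that crawler (a work queue, a small pool of worker coroutines, and a `seen` set) can be sketched in a few lines. Here an in-memory SITE dict stands in for real HTTP fetching so the example is self-contained; the link graph is made up:

```python
import asyncio

# Toy link graph: each "URL" maps to the links "found" on that page.
SITE = {
    "/": ["/a", "/b"],
    "/a": ["/b", "/c"],
    "/b": [],
    "/c": ["/"],
}

async def crawl(root, workers=3):
    seen = {root}
    queue = asyncio.Queue()
    queue.put_nowait(root)

    async def worker():
        while True:
            url = await queue.get()
            for link in SITE.get(url, []):  # "fetch" url, extract links
                if link not in seen:
                    seen.add(link)
                    queue.put_nowait(link)
            queue.task_done()

    tasks = [asyncio.create_task(worker()) for _ in range(workers)]
    await queue.join()  # wait until every queued URL has been handled
    for t in tasks:
        t.cancel()
    return seen

print(sorted(asyncio.run(crawl("/"))))  # -> ['/', '/a', '/b', '/c']
```

The real crawler does the same thing with actual HTTP requests and response parsing in place of the dict lookup.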

d0m 1 day ago 1 reply      
The pep8 standard is also an easy read with so many useful explanations:


veddox 1 day ago 1 reply      
1. The Python standard library (if you're on Linux, /usr/lib/python2.x or 3.x, depending on your version).

2. The Bazaar VCS is written entirely in Python, is very well documented and has a large test section. (www.launchpad.net/bzr)

rasbt 7 hours ago 0 replies      
I wholeheartedly recommend [scikit-learn](https://github.com/scikit-learn/scikit-learn) - the best organized and cleanest code I've seen so far, and really well thought-through.
patrickk 13 hours ago 0 replies      
youtube-dl: https://github.com/rg3/youtube-dl

I fell in love with this project after discovering I don't need ad-choked, dodgy sites to download Youtube videos/mp3s. It also acts as a catch-all downloader for a huge amount of other video hosting sites, despite the name. If you want to learn how to scrape different videos from many platforms, look at this:


notatoad 1 day ago 1 reply      
I learned a lot about python by reading through the tornado codebase. it's pretty easy to read, well broken up into functions, and not too big.
deepaksurti 1 day ago 2 replies      
NLTK: https://github.com/nltk/nltk with the documentation at http://www.nltk.org. I found the code easy to follow through.

I referred to it when adding tests for tokenizers in a common lisp NLP application: https://github.com/vseloved/cl-nlp/.

mapleoin 1 day ago 0 replies      
Also, how about examples of good web applications built on python with available source code?

Rather than seeing the code of great libraries, I sometimes want to see how people use them in the real world.

cessor 1 day ago 0 replies      
I would recommend reading about the Tornado web server. It features some nice stuff such as coroutines and async support.


mahouse 1 day ago 0 replies      
For the web developers out there, what do you think of reddit? Any honest commentary on it? https://github.com/reddit/reddit
tzury 1 day ago 0 replies      
1 - tornado (web, network, file system, and more).

 tornadoweb.org github.com/facebook/tornado

2 - scapy

 The entire product runs on the Python CLI. secdev.org/scapy

ericjang 1 day ago 1 reply      
NetworkX - https://networkx.github.io/ Good example of object-oriented programming patterns (mixins) in Python and module organization.
rwar 1 day ago 0 replies      
danwakefield 1 day ago 2 replies      
Openstack does a large amount of testing for their code[1], but there is a huge amount of it. Barbican[2] is one of the newer, less crufty components.

[1]: https://github.com/openstack/openstack
[2]: https://github.com/openstack/barbican

edoceo 1 day ago 0 replies      
Gentoo Portage package manager. It's a big project, lots of moving parts, actively developed. Really helped me with learning Python.
NumberCruncher 1 day ago 0 replies      
I'm just a user and not a contributor (I don't know the source code), but the project follows good webdev techniques:

* https://github.com/web2py/web2py/

* https://github.com/web2py/pydal

misiti3780 1 day ago 0 replies      
Django, Tornado, Sentry, Newsblur
rjusher 1 day ago 1 reply      
I would also add

*Twisted (https://github.com/twisted/twisted)

For async python.

SFjulie1 1 day ago 1 reply      
Really the hard way?

https://hg.python.org/cpython/file/3.5/Lib/collections/__ini... Knowing specialized data structures in a language is always important, and this is well coded.

Well, see the 2 lines I pointed to in one of the standard library modules in Python and you will understand that even in good languages there will always be dust under the carpet at one point. https://hg.python.org/cpython/file/5c3812412b6f/Lib/email/_h...

gcb0 1 day ago 0 replies      
Any good examples with a (hopefully multiplatform) GUI?
United Airlines awards hackers millions of miles for revealing risks reuters.com
226 points by doppp  1 day ago   105 comments top 13
Tiksi 1 day ago 8 replies      
Taking a quick look at http://www.united.com/web/en-US/apps/mileageplus/awards/trav... it seems 1 million miles ~= $3000-$12000 depending on destination, date, etc. It doesn't seem all that bad of a deal, except plenty of people don't fly often enough to even make use of this, and you have to fly United, which for me would kill all the value.
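Plugging in those figures (taken from the parent's rough estimate, not an official valuation), the per-mile value works out as:

```python
# $3,000-$12,000 for 1,000,000 miles is 0.3-1.2 cents per mile.
miles = 1_000_000
for dollars in (3000, 12000):
    cents_per_mile = dollars * 100 / miles
    print(dollars, "->", cents_per_mile, "cents/mile")
```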
drallison 1 day ago 5 replies      
The rules of the bug bounty program disallowed many of the usual red team approaches to finding possible exploits.

Attempting any of the following will result in permanent disqualification from the bug bounty program and possible criminal and/or legal investigation. We do not allow any actions that could negatively impact the experience on our websites, apps or online portals for other United customers.

 - Brute-force attacks
 - Code injection on live systems
 - Disruption or denial-of-service attacks
 - The compromise or testing of MileagePlus accounts that are not your own
 - Any testing on aircraft or aircraft systems such as inflight entertainment or inflight Wi-Fi
 - Any threats, attempts at coercion or extortion of United employees, Star Alliance member airline employees, other partner airline employees, or customers
 - Physical attacks against United employees, Star Alliance member airline employees, other partner airline employees, or customers
 - Vulnerability scans or automated scans on United servers (including scans using tools such as Acunetix, Core Impact or Nessus)

One can hope that the bad guys are similarly polite. And, as you would expect, the United security folks did not see the irony of their restrictions when it was pointed out to them.

CPLX 1 day ago 3 replies      
I've been very confused about the really negative sniping responses to this program saying the miles aren't worth much.

A million frequent flier miles via a major alliance airline is a very, very sweet prize. That's enough for a person to fly themselves and their spouse to basically anywhere on the planet in business class five times, round trip.

jvehent 1 day ago 1 reply      
I guess creating a bug bounty program is an easier way of pretending that you care than actually fixing your broken TLS...

 $ ./cipherscan united.com
 prio  ciphersuite  protocols              pfs   curves
 1     RC4-SHA      TLSv1,TLSv1.1,TLSv1.2  None  None

adventured 1 day ago 2 replies      
"The cost can be less than hiring outside consultancies."

It's probably ten times cheaper when you consider the per mile cost to the airline. United could hardly be getting a better deal.

ajays 1 day ago 1 reply      
FTA: United unveiled the approach in May just weeks before technological glitches grounded its entire fleet twice, underscoring the risks that airlines face.

Hmm, makes me wonder: could the glitches have been caused by some "hackers" doing testing?

kendallpark 1 day ago 0 replies      
This is a fantastic idea. A great deal for both parties, even if you have to fly United. If you're really that averse to being in the air for a few hours on a United plane, you can probably use the miles for gifts. Get grandma a ticket to visit her grandchildren for her birthday or something.
CyberDildonics 22 hours ago 0 replies      
If they really wanted to thank them they would have given them miles to a different airline.
kriro 1 day ago 3 replies      
Do you have to pay taxes on using the miles?
fapjacks 1 day ago 1 reply      
Phew. Talk about tying two cats together and stuffing them in a pillowcase. That's how I feel about flying United, free miles or no.
nraynaud 1 day ago 0 replies      
that's a lot of broken guitars!
jaybna 1 day ago 1 reply      
What a honeypot. Anyone dumb enough to participate would have to fly United.
droopybuns 1 day ago 2 replies      
Offering miles limits the community of bounty hunters to locations served by united.

Paying out with miles is a fun idea, but the strategy seems fatally flawed to me.

Pyxley: Python Powered Dashboards stitchfix.com
232 points by astrobiased  1 day ago   43 comments top 11
njharman 1 day ago 2 replies      
I've been looking for something to do this (below) in Python, at least on the backend.

Big screen on a wall with 6 or so boxes, each box displaying data which updates in real time. Such as:

 - scrolling list of source control commits
 - graph of busy/idle slaves
 - single big number, pending builds
 - graph of open ticket counts
 - etc
I can't tell if this is one of Pyxley's use cases?

huac 1 day ago 4 replies      
How easy is it to integrate a chart or graph into a larger project? My biggest gripe with Shiny is how difficult it is to use the R calculation and graphing functions in a larger project without using OpenCPU as an API.

My guess is that with Python being a more general-purpose language, this should be easier.

ivan_ah 1 day ago 2 replies      
How important is flask in this mix? How difficult would it be to back this by a django app for example? I'm asking because the charting would be REALLY useful for me right now, and I already have the django models...
butwhy 19 hours ago 0 replies      
Hmm I have a python script that monitors a few datapoints but don't know how to appropriately save them and plot data on a graph/webpage. I might give this a shot?
sshillo 1 day ago 2 replies      
Will work great until you try to render 50,000 points and your browser crashes because it's built on d3.
m_mueller 20 hours ago 0 replies      
This seems nice, but where did you define the datasource(s) for your examples? Did I miss something? I'm already using Flask, so I'd be interested in using this.
njharman 1 day ago 1 reply      
link to examples is 404; a possibly correct link: https://github.com/stitchfix/pyxley/tree/master/examples
jfe 1 day ago 2 replies      
looks great. but i can't help but think it would save everyone a lot of time -- maybe not up-front, but in the long-run -- if we wrote these frameworks in c and just wrote language bindings for r, python, ruby, etc. why are we rebuilding good frameworks over again just because they're not written in our preferred language?
mistermaster 1 day ago 1 reply      
looks cool. shiny is very neat, but has the limitation of having R behind it and debugging ain't fun. I very much look forward to testing pyxley!
rrggrr 23 hours ago 0 replies      
Too many dependencies and layers compared to a pure Python solution like Bokeh, or Seaborn and Flask. I do like the dataframe integration, though.
jqm 22 hours ago 0 replies      
Wow, this is really cool. Playing around w/ US state maps in examples right now. I love stuff like this partially because of the hands on introductions to component technologies I had heard of but not used before.
How to learn data science dataquest.io
244 points by spYa  1 day ago   79 comments top 17
minimaxir 1 day ago 6 replies      
The actual problem with learning "data science" is making inferences and conclusions which do not violate the laws of statistics.

I've seen many submissions to Hacker News and Reddit's /r/dataisbeautiful subreddit where the author goes "look, the analysis supports my conclusion and the R^2 is high, therefore this is a good analysis!" without addressing the assumptions required for those results.

Of course, not everyone has a strong statistical background. But I've seen YC-funded big data startups and venture capitalists commit the same mistakes, and they should really, really know better.

"Data science" is a buzzword that is successful only due to obscurity and because no one actually cares whether the statistics are valid. That's why I've been attempting to open source all my statistical analyses/visualizations, with detailed steps on how to reproduce them. (See my recent /r/dataisbeautiful submissions on Reddit: https://www.reddit.com/user/minimaxir/submitted/ )

pvnick 1 day ago 6 replies      
Good article for beginners. A couple thoughts, just to build on what the author said:

First off, data science == fancy name for data mining/analysis. Wanted to clear that up due to buzzwordy nature of "data science."

Learn SQL - this is the big one. You must be proficient with SQL to be effective at data science. Whether it's running on an RDBMS or translating to map/reduce (Hive) or DAG (Spark), SQL is invaluable. If you don't know what those acronyms mean yet, don't worry. Just learn SQL.
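The kind of aggregate query meant here can be practiced anywhere via Python's bundled sqlite3 module (the table and rows below are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("a", 10.0), ("a", 20.0), ("b", 5.0)])

# Total spend per customer, largest first: the bread-and-butter GROUP BY.
rows = con.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY total DESC
""").fetchall()
print(rows)  # -> [('a', 30.0), ('b', 5.0)]
```

The same SQL carries over almost unchanged to Postgres, MySQL, or Hive.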

Learn to communicate insights - I would add here to try some UI techniques. Highcharts, d3.js, these are good libraries for telling your data story. You can also do a ton just with Excel and not need to write any code beyond what you wrote for the mining portion (usually SQL).

I would also go back to basics with regard to statistical techniques. Start with the simple Z-score; this is such an important tool in your data science toolbox. If you're just looking at raw numbers, try to Z-normalize the data and see what happens. You'd be surprised what you can achieve with a high school statistics textbook, Postgres/MySQL (or even Excel!), and a moderate-sized data set. These are powerful enough to answer the majority of your questions; only when they fail should you move on to sexier algorithms.
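Z-normalization itself is a one-liner; a small hand-checkable sketch using only the standard library:

```python
import statistics

# Subtract the mean, divide by the standard deviation; values beyond
# roughly +/- 2 or 3 deserve a closer look as outliers.
data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = statistics.mean(data)
sd = statistics.pstdev(data)  # population SD; use stdev() for a sample
z = [(x - mean) / sd for x in data]
print(z)  # -> [-1.5, -0.5, -0.5, -0.5, 0.0, 0.0, 1.0, 2.0]
```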

Edit: one more thing I forgot to mention. After SQL, learn Python. There are a ton of libraries in the python ecosystem that are perfect for data science (numpy, scipy, scikit-learn, etc). It's also one of the top languages used in academic settings. My preferred data science workspace involves Python, IPython Notebook, and Pandas (This book is quite good: http://www.amazon.com/Python-Data-Analysis-Wrangling-IPython...)

stdbrouw 1 day ago 1 reply      
Definitely agree that those long lists that tell you to first become awesome at combinatorics, linear algebra, then learn all about statistical inference (that is, not actual statistical procedures but the mathematical underpinnings of statistics that would enable you to construct and evaluate methods you invent yourself), then move on to stochastic optimization... those are really more about machismo than about actually helping people to learn data science. Sure, linear algebra is helpful, but whether it's fundamental really depends on the kind of data science you're keen to do.

I also generally dislike /r/machinelearning and /r/statistics because they seem to have been taken over by people who will tell you to either get a PhD or get out. But, for me, just learning whatever I thought I needed to help me solve the problem at hand got me stuck really fast. There's so much statistics where you really just have to learn it first before you can start to see when and why you'd like to use it.

It never occurred to me to use hierarchical modeling and partial pooling for a certain set of problems until after I'd read Gelman & Hill. I never thought that inference on a changing process might require different techniques from the techniques for stationary processes until I had to study Hidden Markov Models for an exam. Heck, when I got started with data analysis I didn't even realize that the accuracy of most statistics improves proportional to sqrt(n) and so the next logical step in my mind was always "get more data!" instead of "learn more about statistics!" (If you look at the industry's obsession with unsampled data, data warehouses that store absolutely everything ever and map/reduce, my hunch is I'm not the only one who lacks or at some point lacked elementary statistical knowledge because it just never came up on their self-motivated, self-directed learning path.)
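The sqrt(n) point is easy to verify empirically (the simulation sizes below are arbitrary): a 100x bigger sample shrinks the standard error of the mean by only about 10x.

```python
import random
import statistics

random.seed(0)

def se_of_mean(n, trials=500):
    """Empirical standard error of the mean of n standard normals."""
    means = [statistics.fmean(random.gauss(0, 1) for _ in range(n))
             for _ in range(trials)]
    return statistics.pstdev(means)

# 100x more data, only ~10x less noise: the ratio comes out near 10.
print(se_of_mean(25) / se_of_mean(2500))
```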

So I think the ideal learning path incorporates a bit of both: learn more about what excites you and about what's immediately useful right now, but also put aside some time to fill out gaps in your knowledge even things that don't immediately look useful and make some time for fundamental/theoretical study.

(x-posted from DataTau)

lessthunk 1 day ago 1 reply      
Data science is a stupid buzzword. The ideal candidate knows enough about IT to massage data, knows the domain under investigation (the more the better), and for sure some statistics. Most of all, always do sanity checks: does it make sense? Can it be? Is the data correct?

It is an art. Like writing awesome code, etc. practice, practice, and working with experienced people is key.

neovive 1 day ago 1 reply      
"You need something that will motivate you to keep learning." This is so true and often forgotten. I am always learning new things, but the concepts that stick, beyond just the basics, are tied to specific projects or solutions to real problems. I'm typically ok with being a "jack-of-all-trades" for most technologies, just to stay aware of new things. However, when it comes to applying new concepts, skills, or tech to solve problems, a deeper understanding is required; usually obtained through motivation.
washedup 1 day ago 0 replies      
If you want to learn about data science, read this book: http://www-bcf.usc.edu/~gareth/ISL/

I have been going through it and I cannot think of a better resource.

pvaldes 3 hours ago 0 replies      
Learn biology, chemistry and physics, question yourself often and use your instinct.
acbart 1 day ago 0 replies      
As someone who uses "Data Science" to teach "Computational Thinking", I think this blog post hits on a lot of really valuable pedagogical notes: getting motivated, learning things through doing, and having a strong context for your learning.

curiousjorge 1 day ago 2 replies      
If I had some type of practical application that I knew could benefit from data science (like learning RoR to make a marketplace app), it would help a lot, as I would have a clear goal and a route to achieve it. However, data science and machine learning are so broad and seemingly complicated (my fear of complicated math formulas and statistics), and worse, I don't know what I want to achieve out of it, nor do I know what I want to make, which really hinders the learning process for me. I need some incentive or reward at the end of the goal.
anderspitman 1 day ago 0 replies      
These principles are useful when learning anything really: human language (immersion), programming (build something), sports (practice), etc.

That said, as someone who worked in software engineering for 5 years without a degree, and recently returned to school, I would say be careful not to discount studying theory at the same time you're practicing your craft. I really think a combined approach of structured university courses and MOOCs, including reading textbooks, along with applying the knowledge has been the best approach for me.

I was arrogant about "not needing" a degree for years, feeling justified by the fact that I was making very valuable contributions as an engineer, until I finally went back to school and realized how valuable theoretical knowledge can be.

gbersac 19 hours ago 0 replies      
I am doing this course and find it really good: https://www.edx.org/course/scalable-machine-learning-uc-berk...

It is about creating a linear and logistic regression + pca using spark (python api).
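A note on what that course builds: the PCA half reduces to a few lines of linear algebra. The course implements it on Spark's Python API; the single-machine NumPy sketch below (example data made up, not from the course) shows the same math via SVD of the centered data matrix:

```python
import numpy as np

def pca(X, k):
    # PCA via SVD: the top-k right singular vectors of the centered
    # data matrix are the principal directions.
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]          # (k, n_features)
    scores = Xc @ components.T   # projection onto the top-k directions
    return scores, components
```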

tyfon 1 day ago 2 replies      
I've been working as an analyst for 7 years; it's only in the last couple of years that I've heard statistical analysis referred to as data science.

Am I missing something or is it just a new word?

davemel37 1 day ago 1 reply      
Anyone interested in data science should first study cognitive psychology. The CIA has a manual on the psychology of intelligence analysis that is a must read for anyone pursuing any analytical job.

If you don't understand how your mind sees, processes, retains, and recalls data... how can you possibly analyze it accurately?

vermontdevil 1 day ago 0 replies      
Also, learning the R language and using RStudio is a great way to get into it. R has so many packages to help you do any data analysis. The learning curve is quite steep, though.
gtrubetskoy 1 day ago 0 replies      
Read this (free) book: http://mmds.org/
searine 1 day ago 1 reply      
Moving data around is just grunt work.

Real science requires a creative and critical mind, which takes years to mold.

graycat 1 day ago 3 replies      
Here are some topics. Are they considered relevant to data science?

Matrix row rank and column rank are equal.

In matrix theory, the polar decomposition.

Each Hermitian matrix has an orthogonal basis of eigenvectors.

Weak law of large numbers.

Strong law of large numbers.

The Radon-Nikodym theorem and conditional expectation.

Sample mean and variance are sufficient statistics for independent, identically distributed samples from a univariate Gaussian distribution.

The Neyman-Pearson lemma.

The Cramer-Rao lower bound.

The martingale convergence theorem.

Convergence results of Markov chains.

Markov processes in continuous time.

The law of the iterated logarithm.

The Lindeberg-Feller version of the central limit theorem.

The normal equations of linear regression analysis.

Non-parametric statistical hypothesis tests.

Power spectral estimation of second order, stationary stochastic processes.

Resampling plans.

Unbiased estimation.

Minimum variance estimation.

Maximum likelihood estimation.

Uniform minimum variance unbiased estimation.

Wiener filtering.

Kalman filtering.

Autoregressive moving average (ARMA) processes.

Rank statistics are always sufficient.

Farkas lemma.

Minimum spanning trees on directed graphs.

The simplex algorithm of linear programming.

Column generation in linear programming (Gilmore-Gomory).

The simplex algorithm for min cost capacitated network flows.

Conjugate gradients.

The Kuhn-Tucker conditions.

Constraint qualifications for the Kuhn-Tucker conditions.

Fourier series.

The Fourier transform.

Hilbert space.

Banach space.

Quasi-Newton iteration and updates, e.g., Broyden-Fletcher-Goldfarb-Shanno.

Orthogonal polynomials for numerically stable polynomial curve fitting.

Lagrange multipliers.

The Pontryagin maximum principle.

Quadratic programming.

Convex programming.

Multi-objective programming.

Integer linear programming.

Deterministic dynamic programming.

Stochastic dynamic programming.

The linear-quadratic-Gaussian case of dynamic programming.
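To make one item on that list concrete: the normal equations of linear regression reduce least-squares fitting to solving the small linear system X^T X b = X^T y. A minimal NumPy sketch of the textbook form (in practice a QR factorization or lstsq is numerically safer):

```python
import numpy as np

def ols_normal_equations(x, y):
    # Fit y ~ b0 + b1*x by solving the normal equations (X'X) b = X'y.
    X = np.column_stack([np.ones(len(x)), np.asarray(x, dtype=float)])
    y = np.asarray(y, dtype=float)
    return np.linalg.solve(X.T @ X, X.T @ y)  # [intercept, slope]
```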

Legends in D3 susielu.com
198 points by jsweojtj  20 hours ago   20 comments top 6
bane 13 hours ago 3 replies      
A team I work with built a great internal tool with D3 and there were a few things that they did that really bumped it up to the next level:

- The data was the interface. The things they used d3 to show were also the interaction elements, which could be clicked to focus in on subsets of the same data. It was almost fractal, but also immediately clear and simple to use. There were no control bars or sliders or really any buttons. Anybody who could use Google Maps could use this, and it took about 10 seconds to show somebody how it worked before they could operate it at a high level.

- The legend was not just a visual aid, but had interactive components in the tool:

1) Hovering over an element in the legend caused all the corresponding elements in the display to highlight so you could see where they were in a complex display (and vice versa: hovering over an element in the main display caused that element in the legend to highlight).

2) Clicking an element in the legend caused it to filter on-off, allowing various unimportant parts of the display to be eliminated.

These three things changed it from just a static picture into a useful analytic tool and getting more value out of the legend by turning it from just a picture into a fully interactive element felt so obvious in retrospect that legends I see now that are just a legend feel incomplete.

stared 16 hours ago 4 replies      
Very nice! I was tired of making my own legends for every single thing. And this one looks really nice and easy.

Though, one of my pet peeves: people, please don't use discrete color scales for continuous parameters. It distorts the presentation for no good reason. For example, two countries are colored the same way even though they are on opposing edges of a bin (so the difference is masked), or two countries are colored differently even though the difference is minimal but just passes an artificial threshold (so an artificial difference is shown).
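The fix the comment asks for is cheap: a continuous scale is just interpolation between endpoint colors. A minimal sketch with a hypothetical helper (plain RGB for simplicity; perceptually uniform spaces like Lab are better in practice):

```python
def lerp_color(c0, c1, t):
    # Map t in [0, 1] to a color between c0 and c1 (RGB tuples).
    # A continuous scale like this preserves small differences between
    # values that a binned (discrete) scale would mask entirely.
    return tuple(round(a + (b - a) * t) for a, b in zip(c0, c1))
```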

aubergene 15 hours ago 0 replies      
Looks very good.

For the size legend, the defaults are a bit odd. You should almost always be using d3.scale.sqrt() as you're comparing area, also zero in the domain should usually map to zero on the range.

I made a similar legend for circle areas, but they are stacked within each other. http://bl.ocks.org/aubergene/4723857
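The parent's point in code form, with a hypothetical helper mirroring what d3.scale.sqrt() does: if readers compare circle areas, the radius must grow with the square root of the value, and zero in the domain must map to zero in the range:

```python
import math

def radius_for(value, max_value, max_radius):
    # Perceived quantity is the circle's area, and area ~ r^2, so the
    # radius must scale with sqrt(value). A linear radius scale would
    # exaggerate large values quadratically.
    return max_radius * math.sqrt(value / max_value)
```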

mslev 9 hours ago 1 reply      
I can't be the only one who thought this article would be about people who will be known throughout history for their amazing work in D3: champions, heroes, front-end developers!
geoffharcourt 12 hours ago 0 replies      
This is awesome. After spending a lot of time tweaking chart legend code, I would be very happy to pull a polished toolset like this in to future projects.
couchand 13 hours ago 2 replies      
"Tired of making legends for your data visualizations?"

No. It never takes more than about two minutes to whip up something basic like this.

Integrating your legend tightly with your visualization has lots of benefits, mostly in terms of enabling interaction.

GCC 5.2 released gnu.org
184 points by RoboSeldon  1 day ago   33 comments top 8
agwa 1 day ago 2 replies      
It's worth noting that, due to GCC's new and esoteric version numbering scheme[1], this is a bugfix-only release on the 5.x branch. The bigger news was when GCC 5.1 (a major stable release) was released in April. That said, GCC 5.2 fixes a pretty serious bug that I was hitting so I'm happy to hear about it!

[1] https://gcc.gnu.org/develop.html#num_scheme

lifeisstillgood 1 day ago 0 replies      
It's worth celebrating (well, noting in an approving manner) this release. GCC is a foundation stone project for, well, a huge amount of the world's wealth.

It is a healthy project (even with Clang etc.), and with the Heartbleed post mortem on our minds we should probably ask what can be done to keep it and others healthy: auditing, foundation status, etc.

So thank you GCC developers, for making sure I do not need to worry about something.

the8472 1 day ago 4 replies      
> Write-only variables are now detected and optimized out.

That's probably something one should keep in mind when writing micro-benchmarks.

Meai 1 day ago 2 replies      
Does GCC support modules? http://clang.llvm.org/docs/Modules.html
ademarre 1 day ago 2 replies      
With this release the golang frontend catches up to Go version 1.4.2.

EDIT: I'm wrong. That was GCC 5.1.

tosseraccount 1 day ago 2 replies      
Any benchmarks? How does this compare with previous versions of gcc?
minot 1 day ago 1 reply      
Hi guys, any idea when we might get this on mingw?
anon3_ 1 day ago 0 replies      
Where is GCC's place in the world now that there is LLVM/Clang?

Does the LLVM+Clang combo make GCC obsolete?

Three Dead Protocols annharter.com
192 points by englishm  2 days ago   75 comments top 23
userbinator 1 day ago 1 reply      
I think trivial protocols like this are a good thing to start with for educational purposes, because implementing one correctly does require quite a bit of effort for someone who has had no experience with networking or RFCs.

Even for something as simple as QOTD the implementer has to consider things like message lengths and interpret terms like "should" (a recommendation, not an obligatory condition for compliance). Observe that the standard also doesn't mandate that the message change only once per day, so the implementation presented is compliant. :-)

For TCP Echo, because TCP is a stream-oriented protocol - and since, AFAIK, you can't actually send and receive simultaneously in code; it's always read or write - the question of how much to echo back, and after how long, is also something to consider. Theoretically, an echo server could wait to send until several GB of data were received or the connection is closed, buffering the data limitlessly, and still be compliant. This also shows the importance of being clear and precise when writing standards or protocol specifications in general, should you ever need to do so.
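
For anyone who wants to try the exercise, a minimal TCP QOTD server along the lines the comment describes can be sketched in a few lines of Python. The 512-character limit below is RFC 865's "should", so the truncation is politeness rather than a compliance requirement; the quote text and port are of course up to you:

```python
import socket

RFC865_LIMIT = 512  # RFC 865 recommends quotes shorter than 512 characters

def qotd_payload(quote):
    # Truncate to the recommended limit; since "should" is only a
    # recommendation, a longer quote would still be compliant.
    return quote.encode("ascii", "replace")[:RFC865_LIMIT]

def serve_one_quote(port, quote):
    # TCP mode of RFC 865: accept a connection, send the quote,
    # ignore any input, close.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            conn.sendall(qotd_payload(quote))
```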

linuxlizard 1 day ago 0 replies      
Late 90's I did firmware for print servers. The echo server was pretty important to us for testing our hand-rolled TCP/IP stack.

Print server management was done through a Telnet interface. We also supported LPD which was one of the stupider protocols ever to see the light of day.

I added a QOTD service to the firmware as an easter egg.

I'm going to go soak my teeth now.

Animats 1 day ago 5 replies      
As I mentioned when someone brought up the history of UDP, the original idea was that datagram protocols would be implemented at the IP level, as seen here. UDP offers the same functionality, but one level higher. In BSD, it was easier to do things from user space at the UDP level rather than at the IP level, and adding new protocols directly above IP fell out of favor.

Try to get an IP packet that's not TCP, UDP, or ICMP through a consumer level Internet provider.

achillean 1 day ago 3 replies      
These protocols may be deprecated, they may be unused and they may be out of sight but they aren't completely dead yet:


Many of these old protocols don't die easily and tend to linger around forever. Maybe there's a nostalgic element to keeping them alive for sysadmins :)

placeybordeaux 1 day ago 3 replies      
Given that the definition[1] of the echo protocol works on UDP, you could potentially spoof the source address to be that of another echo server and have packets going back and forth indefinitely, correct?


ajslater 2 hours ago 0 replies      
No mention of finger, port 79.


emmab 1 day ago 2 replies      
I think your implementation of "RFC 862, the Echo Protocol" wouldn't work if the input doesn't end in a newline.
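
A line-based read loop is the usual culprit here; reading fixed-size chunks until the peer closes avoids depending on newlines at all. A sketch of such a per-connection loop (the function name and buffer size are my own, not from the article):

```python
import socket

def echo_connection(conn, bufsize=4096):
    # RFC 862: send back exactly what is received, until the peer
    # closes its side. Reading raw chunks instead of lines means
    # input without a trailing newline is echoed too.
    total = 0
    while True:
        chunk = conn.recv(bufsize)
        if not chunk:          # b"" means the peer is done sending
            break
        conn.sendall(chunk)
        total += len(chunk)
    return total
```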
TheLoneWolfling 1 day ago 1 reply      
This actually brings up an annoyance with FF (well, Pale Moon, but same difference). If you try to open, say, pchs.co:17 with FF, it'll pop up a prompt saying "this address is restricted" - with no way of overriding it.

You have to go into the config and add a key (!) to actually be able to access it. And worse, there's no way I've seen to actually just straight disable the "feature". You have to add an individual port, or a range of ports, or a comma-separated list of ports or ranges.

(For those wondering, it's "network.security.ports.banned.override", with a value of a port, or range, or comma-separated list of ports or ranges. For example: "7,13,17".)

Once you do, it works fine.

johnwfinigan 1 day ago 0 replies      
I have actually used daytime for a "real" use: as a quick and dirty way of eliminating the possibility of guest clock drift when running benchmark scripts inside of emulated guests with unreliable timekeeping. Obviously a bad idea for benchmarks measured on the order of seconds, but probably fine for benchmarks running for hours. ntpdate -q would probably work just as well though.
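
That quick-and-dirty check is easy to reproduce: a daytime (RFC 867) client is just "connect, read until close", since the reply format is deliberately unspecified. A minimal sketch (helper name is mine):

```python
import socket

def daytime(host, port=13, timeout=5.0):
    # RFC 867 TCP mode: the server sends its idea of the current time
    # as opaque ASCII text and closes; the client sends nothing.
    with socket.create_connection((host, port), timeout=timeout) as s:
        chunks = []
        while True:
            data = s.recv(1024)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("ascii", "replace").strip()
```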
zx2c4 1 day ago 1 reply      
I've been running a QOTD service on my server for the last few years:

 $ nc zx2c4.com 17
Source here: http://git.zx2c4.com/mulder-listen-daemon/tree/mulderd.c

I also run a toy telnet server:

 $ telnet zx2c4.com

batou 1 day ago 1 reply      
I spoke to someone a few years ago who had an asymmetrical transit-cost agreement between two companies. He joked that it may have been lucrative to just pipe /dev/random to their echo port 24/7.

I suspect that is one of the many reasons it is a dead protocol.

foliveira 1 day ago 0 replies      
Nice little exercise. Just implemented the three servers in Node.js over lunch time.

[1] https://github.com/foliveira/echo-is-not-dead

[2] https://github.com/foliveira/qotd-is-not-dead

[3] https://github.com/foliveira/daytime-is-not-dead

rumcajz 1 day ago 1 reply      
Don't forget about TCPMUX, listening on port 1 (RFC 1078). That's serious stuff that could see many applications even in today's world.
skrebbel 1 day ago 1 reply      
Wait, did she just start an infinite number of threads in a loop, or is ruby awesome in ways I didn't know?
anotherevan 1 day ago 0 replies      
Hmmm, I run Q4TD[1] and now I'm thinking I should implement my own QOTD service...

I wonder if I could do that with Google App Engine talking to the blog and just picking random posts.

[1] http://q4td.blogspot.com/ http://www.twitter.com/q4td https://plus.google.com/u/0/110672212432591877153/posts http://www.facebook.com/quote4theday

chrismorgan 1 day ago 0 replies      
RFC 2616 has been superseded by RFC 7230 et al.
imauld 1 day ago 0 replies      
"This isn't about the protocol, but you should know my code for this is really sloppy because it was my first time attempting to use vim and literally everything was hard."

Ahh, Vim. It makes me happy to know that more seasoned developers than myself have issues with it as well.

dec0dedab0de 1 day ago 0 replies      
Every time I look down the well known port numbers I imagine setting up a box with every protocol running.

A bit of an aside, how many people still use plain netcat? I switched to ncat years ago, and haven't looked back.

kijin 1 day ago 4 replies      
Pretty much every port below 1024 is reserved for one protocol or another, but many of them have been obsolete for years. It seems that whoever was in charge of assigning well-known ports back then just handed them out like candy.

Well, who am I kidding? This is the same IANA that used to hand out humongous blocks of IPv4 addresses to anyone who asked.

Should we try to deprecate dead protocols so that low ports can be put into better use? Or have we come to expect that all new technologies will simply reuse ports 80 & 443, so we have no need to set aside new well-known ports anymore?
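You can see how thoroughly those low ports were handed out by asking the local services database (on Unix, /etc/services) what each one is reserved for; a quick sketch:

```python
import socket

def known_services(ports, proto="tcp"):
    # Map each port to its registered well-known service name, or None
    # if the local services database has no entry for it.
    names = {}
    for port in ports:
        try:
            names[port] = socket.getservbyport(port, proto)
        except OSError:
            names[port] = None
    return names
```

On a typical Linux box this reports the dead trio discussed in the article (7 echo, 13 daytime, 17 qotd) alongside the two ports everything now reuses (80 http, 443 https).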

kijin 1 day ago 2 replies      
> "May 1983" [footnote] "Fwiw, RFC 2616, for HTTP, was published the same month, so at least some people were doing actual work in those days."

RFC 2616 was published in June 1999.

I don't know what Sir Tim was doing in May 1983, but I'm pretty sure he wasn't writing an RFC for a protocol that he wouldn't invent for six more years.


vhost- 1 day ago 1 reply      
I suspect a few more implementations of these are going to spin up. I just did the qotd in Go: https://github.com/kyleterry/qotd
mml 1 day ago 0 replies      
someone should tell her about the fortune file :(
dilap 1 day ago 1 reply      
The QOTD seems to just hang sometimes. Anyone have any guesses as to why?
NASAs New Horizons Discovers Frozen Plains in the Heart of Plutos Heart nasa.gov
187 points by Thorondor  8 hours ago   80 comments top 8
leemac 7 hours ago 2 replies      
Seems like this entire Pluto mission has opened up quite a few questions for geologists. For such a small world so far from the sun, it sure has some very interesting features and characteristics. The recent mountain range photo/3D map was incredible.

We live in some exciting times. Every few months we have a new probe somewhere teaching us so much about our tiny corner of the universe.

ashhimself 5 minutes ago 0 replies      
Serious question: how would they know it's no more than 100 million years old? I consider myself a somewhat smart guy, but this is... beyond me :(
aklein 6 hours ago 6 replies      
The numerical accuracy and calculations needed for getting the spacecraft so close to Pluto must be pretty awesome. Does anyone know what the precision is on calculations like these?

Also, anyone know why the spacecraft has to do a flyby, as opposed to, say, going into orbit around Pluto? Is it because the fuel needed to slow the spacecraft down would be prohibitively heavy?

atorralb 42 minutes ago 0 replies      
An image search on it turns up a lot of old concrete walls... wtf? https://www.google.com/search?tbm=isch&imgil=5aN7IeRgNSROxM%...
BurningFrog 7 hours ago 2 replies      

I expected just another dusty cratered rock.

yk 4 hours ago 0 replies      
If anybody else has the problem that the first and last word of each line is cropped, just disable the ".after-body, .article-body" css. This brings back the scroll bars.
mkoryak 7 hours ago 6 replies      
"Data from New Horizons will continue to fuel discovery for years to come."

Why will it take years?

ajays 6 hours ago 2 replies      
Where did the water come from on Pluto?
Work for a remote culture higginsninja.net
172 points by ScottWRobinson  2 days ago   138 comments top 22
jasode 2 days ago 15 replies      
>Companies that support remote workers and do it well seem to have a huge leg up on the competition.

I'm a remote worker and I'm definitely spoiled by not having to deal with open office plans and disruptions.

That said, I see no evidence that the article's statement is true. (I want it to be true, but that doesn't change the fact that so far, I see zero evidence that it is true.)

You can't convince people with just rhetoric.

Instead, show compelling examples of how Company-A-with-mostly-remote-workers is beating the pants off of the Company-B-with-onsite-workers.

Show that RemoteWorkerCompany is 10x more innovative, delivers 10x faster, has 10x profits, etc etc.

I suspect finding comparisons that control for other variables are hard to come up with. Nevertheless, that's what it's going to take to convince managers. A bunch of programmers writing a thousand essays on the "advantages" of remote workers is just preaching to people like me who happen to like working from home.

pcmaffey 2 days ago 0 replies      
I've been a remote worker for almost 10 years. In fact, I hardly know what working in an office is like.

I can say that remote working is not without its own set of challenges. Chief among them is the need for EXCELLENT communication skills. I've worked with supremely talented and previously successful teams that simply fall apart in a remote environment, because they don't understand the value of transparent communication. Working remotely, everyone relies almost entirely on the word of their coworkers, as it's much more difficult for leaders/managers to get a read on their team.

Like any successful team, it comes down to establishing trust. But with remote teams, there's much less margin for error.

andrewljohnson 2 days ago 3 replies      
We've done both (office in Berkeley, but now two founders in Berkeley, and the other 7 or so around the US).

A couple of points:

a) It should be all remote, or all local. You ruin the fun for people if a chunk of the team is talking in person and chat is dead.

b) My co-founder/wife and I purposefully avoid shop talk, and keep it all in chat.

c) Remote is a good lifestyle for some people. We look for people who will be happy at home, people with lives and families, who don't need to meet friends to go out with, or lovers, at work. This fits our culture - we attract outdoors types, who like time to themselves.

d) Some people are self-driven and make good remote workers. Some people need to be in an office and get constant nudging from physical social pressure (and their boss).

e) Tools: weekly Google Hangout, frequent one-on-ones, Slack, Trello, Github - almost zero email.

discardorama 2 days ago 1 reply      
I worked for a large company that had a somewhat flexible policy (if your manager allowed it, you could). I found that on the days I was working from home, I could get a lot more done. I was more relaxed, and highly productive. I didn't have to go through the morning ritual of chit-chat, coffee, etc. etc.

Having said that: I also saw how it was being abused. There were some colleagues who were working side gigs while working remotely. If it's a highly specialized area, then it's hard for a manager to make sure you're pulling your weight. There were some stellar remote workers; and there were some total slackers. Just like with anything else in life, it's a mixed bag.

omouse 2 days ago 6 replies      
I find that a remote culture is hard to build if half your office is remote and the other half is in the office.
TeamMCS 2 days ago 1 reply      
I love this idea and very much agree you end up with a much more modular and proactive organisation with the right people. Owing to desk-space issues, finance is moving more towards working from home. Honestly, in my next job I'd like to go 100% WFH: the amount of time saved from commuting is huge and, from an organisational perspective, if I get inspired over the weekend or in the evening I'm much more likely just to get some bits and pieces done.

Unfortunately there are still a lot of legacy managers and people who make the transition tricky. If you can start from the ground up, or at least foster it within teams, that's great.

gizi 2 days ago 0 replies      
I've worked remotely for the last 10 years now. Existing businesses will probably stick to their existing practices until they die, while new startups which don't have too much baggage to drag around, may indeed see that the future is heavily biased in favour of working remote:

- There is a higher caliber of workers worldwide than just next door (if the company is truly capable of distinguishing)

- Some workers may conceivably even be cheaper (cheaper countries, but don't count too much on that for high calibre workers)

- Part of their salary is that they get a huge portion of their life back (this has real cash value)

- The tooling and overall technical management tends to be better, simply because it has to

- Especially in an international context, both employer and employee avoid an impressive number of government regulations, including visas, work permits and so on.

Even though an entirely remote company is more productive, I don't believe that existing on-site companies would be able to introduce it. I see them rather continuing to outsource while shrinking their head count year after year.

pmikesell 2 days ago 1 reply      
It doesn't have to be 100% one way or the other. I think the optimal solution might be something like 1 (or maybe 2) days per week in the office for planning and sync up, and then 3 days WFH.

I'm sure it depends on the size of the organization and the type of work being done (even depending on the type of software being written), but it's something that could be experimentally figured out.

zkhalique 2 days ago 9 replies      
I happen to agree. As an employer however, we've found it hard to always keep our developers motivated and executing on time. Any advice?
scottndecker 2 days ago 2 replies      
The first four points are summarized in "Email is better than verbal communication." While email definitely has advantages, there are tons of downsides of having to rely on written communication. It's slower, you don't get the non-verbals, it's easy to misinterpret, you form less of a bond with the person on the other end, etc.

The last three points are much better (though I can't attest if they're true or not).

andrewtbham 2 days ago 0 replies      
My friend Tony, who owns a startup called Fleetio, made a compelling presentation about remote work and its advantages. It also has lots of info on tools.


pselbert 2 days ago 0 replies      
I've been partially remote for the past 5 years, averaging 2-3 days a week in the office. During that time I've also had stints of being fully remote while overseas for a few months at a time. The fully remote time is vastly more productive, but could be very difficult regarding communication. The work environment wasn't designed to be remote-first, and this puts a strain on anybody who isn't in the office.

In my experience remote all-the-way works wonderfully. Remote partially suffers around communication when the rest of the team doesn't emphasize asynchronous tools like chat/comments/email.

robohamburger 2 days ago 0 replies      
As a person who has spent half their career telecommuting and half in an open floor plan: it seems mostly like a series of trade-offs.

Being remote forces you to be more deliberate with communication. I tend to make more design docs and proposals than I think I otherwise would.

Working in an open floor plan means you can randomly overhear things you are either passionate about or have expertise in and jump in.

I suspect there is also an emotional and introvert/extrovert bit as well. Sometimes it is nice to physically be around people working industriously in your same problem domain.

49531 2 days ago 0 replies      
I recently switched from a non-remote team to a remote one, and I can definitely echo these sentiments.

The biggest difference I have seen is that merit seems to give more weight to a team member's position than politics.

grumps 2 days ago 0 replies      
There are many downfalls to working remotely. In the past year of being remote, I've found that it can be very difficult to pick up on the attitude of a co-worker or boss. It can also make difficult conversations not go so well. People will also avoid face-to-face conversations because they don't like confrontation. I like working remotely, but I also think it's important that a "connection" be built with your co-workers, and sometimes that only partially happens over chat and hangouts.
varelse 2 days ago 0 replies      
I find that in the age of the open office, working remotely is the only time I'm really productive. I didn't mind working from an 8x8 cube, but my current tiny desk in a maze of desks, all alike, is the pits (cue some nimrod posturing about how this enhances agile/availability/WTFever).

That said, I totally grok people who have a large social component to their dayjob seeing these things as a perk. I am not one of those people. I write code. If it isn't truly important, GO AWAY...

sulam 2 days ago 0 replies      
The one quantitative thing I think you can say about companies with lots of remote workers is that they don't have to pay as much for engineers. Speaking as someone who lives in the Bay Area, that makes me less likely to work for such a company. Yes, I could move somewhere where the cost of living is lower, but I actually like it here, and I have yet to run across a company with a strong remote culture that doesn't use it as a cost-savings device (among other things).
dom96 2 days ago 2 replies      
As someone who would be interested in working remotely, I have to ask: which companies would you say have a good remote culture?
Sawbones 2 days ago 2 replies      
I would love to work for a remote company. I had partial remote work for a while and wanted to just go full remote.
dataker 2 days ago 0 replies      
That's a broad statement: time zones and/or cultural differences (for other countries) might make communication a lot harder.

Working in the U.S with someone from Australia or Asia is a lot harder than with someone down in Mexico or Canada.

baby 2 days ago 0 replies      
Curious about what the people at buffer think of this
copsarebastards 2 days ago 4 replies      
Look, I get that working remotely is good for many workers, but let's admit that's why we're doing it instead of pretending it's about "getting a leg up on competition".

> They prioritize communication and collaboration by necessity

This is exactly what remote culture doesn't do. A remote culture inherently has decided that being off-site is more important to them than being able to communicate and collaborate quickly and easily.

> It is easy to reach out for help

Easier than standing up and asking for help over the cubicle wall? No.

I can see how the increased difficulty of communication could be arguably a better thing, but let's not pretend it's easier.

> There is a lot less wasted communication

This is true, but it's balanced by the fact that the non-wasted communication costs a whole lot more. And I think it's also an overstated claim: just because you can refer back to your communication doesn't mean anyone actually can find the communications or actually does go back and refer to it.

> There is a lot less posturing

I can see how this is annoying, but I'm not sure how this relates to a claim of productivity. In the end, I'm not sure presenting ideas with logic is actually the best way to do things: often I'd rather just do what the best doers on the team suggest, with or without justification. The best plan is the one that gets the job done quickly and with quality, even if it's one that's hard to justify logically beforehand. Requiring logical justification for things means that you prioritize the talk of people who talk well over the intuition of people who are actually good at their jobs. It's worth noting that posturing tends to be most effective for people who are respected by their teammates, which tends to correlate with being good at your job in technical fields. Posturing evolved as a behavior for a reason, and I'd not be so quick to discard it.

Also, there's an apparent contradiction between "Even if we are over-communicating, it is okay, because we aren't forcing a squadron of employees to sit in a meeting room pretending to be interested." and "I have to believe that it has something to do with the fact that most of the effective communication is either written, or is done in large meetings where lots of people are watching." Either your meetings are larger or they aren't; don't pick whichever fits the point you're trying to make.

> There is often a higher caliber of workers

Uh, okay. If we're allowed to make vague unjustified claims, I guess I can just counter with, "Remote work tends to lead to unhealthier workers because they don't go outside."

> You get a huge portion of your life back

This is also a pretty unjustified claim. When I worked from home I found that it was a lot harder to turn off when my workday was done.

Citing commute times is a good justification for living near where you work, not necessarily for working remotely. I'm a 25-minute train ride from my work, and having an excuse to get up and go out is pretty good for me in general.

LAX becomes largest U.S. airport to allow Uber, Lyft pickups latimes.com
169 points by denzil_correa  1 day ago   119 comments top 13
SovietDissident 1 day ago 15 replies      
I used to live in Playa del Rey, which is maybe 10 minutes from LAX. There's a minimum fare for cabs of something like $17, which I was forced to pay. But the worst part was that every freaking time the cab driver would give me attitude for the short fare (ostensibly because they have to wait in that long line). What do you want me to do? Walk an hour home carrying my luggage? Uber and Lyft are a complete blessing.

Despite all these taxi regulations which are supposedly there to protect the consumer, all they did was create an entrenched oligopoly, where taxi companies were complacent because they basically lacked competition and didn't have to increase the quality of their service. But now that people have an alternative and are eschewing cab services in droves, they are crying bloody murder. Stop blaming the consumer, lobby to get rid of the medallion/regulatory model, and get ready to finally compete (or perish)!

ransom1538 22 hours ago 3 replies      
My wife had a girls' trip in Chicago. She made the error of just getting a "cab" and not using her Uber app. It turns out, what she got into was some kind of cab that didn't have a till. The driver was angry she didn't have cash, drove her around to an ATM, demanded $100, and refused to let her have her items out of the trunk. Personally, I am no longer comfortable with her getting into cabs or non-Uber black cars - it is becoming unsafe. If that driver decided to do something -- he would be off the grid.
jedberg 1 day ago 3 replies      
It was already allowed. I took an Uber from LAX a few weeks ago.

The catch was that they had to buy a $5 "temporary taxi license" (that Uber paid for) when they came in for each ride, which delayed them 5 minutes. Now they won't have to stop, which is nice.

xasos 22 hours ago 0 replies      
I'm assuming that LAX is the largest U.S. airport to allow regular Uber pickups?

You can request an UberTaxi or Uber Black car at ORD [0], which is the second largest airport in terms of traffic in the world [1].

[0] https://www.facebook.com/uber/posts/601574873216135
[1] http://www.chicagobusiness.com/article/20140924/BLOGS02/1409...

mayneack 1 day ago 1 reply      
> To work at the airport, drivers cannot have convictions for reckless driving, hit and run, driving under the influence, sexual crimes or terrorism

I get that some of these are sort of taxi or driver specific and that the rules are a little odd anyway, but it seems that murder should be on this list too?

pbreit 1 day ago 4 replies      
What are the main, reasonable arguments for disallowing paid airport pickups? Safety? Traffic? Control?
unknownzero 1 day ago 0 replies      
Somewhat off-topic, but it's interesting that the article lists LAX as the third-busiest airport. When I googled to see what the first two were out of curiosity, the search results showed it as the second busiest. That data on Google appears to come from the Wikipedia article https://en.wikipedia.org/wiki/List_of_the_busiest_airports_i... for "Busiest US airports by total passenger boardings". LAX then appears to be listed as fourth busiest in "30 busiest US airports by total passenger traffic". These numbers obviously vary by year, and it seems the one Google picked up was from preliminary FAA data. No idea why this stuck out to me, but for some reason I want to know why the author settled on third busiest here out of the myriad choices.
unabridged 1 day ago 2 replies      
How can they tell the difference between an Uber driver and a friend I called to pick me up?
ChrisNorstrom 1 day ago 3 replies      
How to get around Uber airport bans: take a taxi to just outside the airport, and get an Uber from there. It's still much cheaper.

Perfect example: Orlando International Airport is a jail. It has no sidewalks leading in or out of the airport. It's about 1 mile of 60mph road and ramps leading in and out, with NO shoulders and NO sidewalks. Look it up on Google Maps. It's impossible to leave. I arrive after 11pm, the buses are not running, and a friend can't pick me up. A 6-mile taxi ride is $55! An Uber ride is about $12. I try to hop on the Parking Spot / rental car shuttles to get just outside the airport's ban radius, but they've caught on and no one will let me on. I'm not paying $55.

So I just grabbed a taxi and had him drop me off just outside the airport at a Denny's. Cost: $13 for 1 mile. Grabbed an Uber for the remainder of the trip which was $11. Paid $24 instead of $55.

mhartl 20 hours ago 0 replies      
I usually Uber to the airport and taxi back. The latter costs almost 50% more. If this policy goes through, it will be a welcome change.
DrScump 1 day ago 2 replies      
It will be short-lived unless they overturn this ruling: http://www.latimes.com/business/la-fi-uber-suspended-2015071...
grapevines 23 hours ago 0 replies      
What are the chances that somebody would do LAX to Santa Barbara?
kkt262 21 hours ago 0 replies      
Awesome. Being in LA this is really good news.
Greuler Graph theory visualizations maurizzzio.github.io
169 points by shawndumas  1 day ago   23 comments top 12
degenerate 1 day ago 1 reply      
Whenever I see a new graph visualization tool, I instantly start dragging the nodes around like a madman to see how well the graph "balances" in the viewport. I soon noticed a tutorial was running while I was manipulating the position of the nodes, and regardless of how much stretching and pulling I did to try to break the tutorial, it worked flawlessly, as if I wasn't even touching it. That, I thought, was really awesome. Well done!
cschmidt 1 day ago 2 replies      
Looks nice. I do find the headlines drawing themselves a letter at a time kind of distracting. It makes it hard to scan down the page, as the headlines aren't there yet.
dikaiosune 1 day ago 1 reply      
Very cool! I was lazily working on something similar for a project, and might be able to drop this in and save a ton of time.

If the author reads this, might I suggest some examples using the label attributes of nodes and edges? It'd be great to see a graph annotated (potentially with some image or link tags embedded for further browsing?).

leni536 18 hours ago 0 replies      
Really nice. One nitpick: I can drag the graphs out of the SVG frame and then they are lost forever. It would be nice if there were an attraction toward the center of the frame, or walls that wouldn't let this happen.
pmall 17 hours ago 0 replies      
How many nodes/edges can this handle? I'm working with graphs with hundreds of nodes/edges.
fait 1 day ago 2 replies      
Cool. Any way I could connect my Neo4j API and visualize the db or individual requests?
alexisnorman 1 day ago 0 replies      
Awesome! I'm currently piecing together teaching material for an intro to algorithms class and this is excellent. Right about to implement Bellman-Ford's and Dijkstra's.
TheGrassyKnoll 1 day ago 1 reply      
I counted six cycles?

  6,7,8,6
  8,0,3,4,9,8
  0,5,1,0
  0,1,3,0 (are these not considered separate cycles?)
  8,0,5,1,3,4,9,8
  0,5,1,3,0
Anyway, very nice presentation.
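A quick way to check the count: for a connected undirected graph, only E - V + 1 cycles are independent; any further cycles (like the "separate cycles" asked about) are combinations of those. A small sketch, with the demo graph's edge set reconstructed from the cycles listed above (an assumption; the actual demo may have more edges):

```python
# Edges reconstructed (an assumption) from the cycles listed above.
edges = {(6, 7), (7, 8), (6, 8), (0, 8), (0, 3), (3, 4), (4, 9),
         (8, 9), (0, 5), (1, 5), (0, 1), (1, 3)}
nodes = {v for e in edges for v in e}

# For a connected undirected graph, the cycle space has dimension
# E - V + 1; every cycle in the graph is a combination of that many
# independent ones, so counting more than that is expected.
dim = len(edges) - len(nodes) + 1
print(dim)  # prints 4
```

So finding six (or more) distinct cycles is consistent with only four independent ones here, assuming those twelve edges are the whole graph.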

lotophage 1 day ago 1 reply      
This is an entirely personal opinion, but as nice as it is, I find the font painfully difficult to read.
compostor42 1 day ago 0 replies      
Wow, this is slick!

Would love to use this to visualize a min/max algorithm.

A nice way to demonstrate graph algorithms, as you have already shown!

maurizzzio 1 day ago 1 reply      
thanks for the submission OP!
eridal 1 day ago 1 reply      
love it!!

I am looking for a tool to construct a PERT visualization, but it requires an activity-on-node type of graph.

Does Greuler allow constructing such graphs?

An Identity Thief Explains the Art of Emptying Your Bank Account bloomberg.com
181 points by ressmox  2 days ago   62 comments top 7
ScottBurson 1 day ago 8 replies      
Wait, I don't get this. The Amex agent called the old phone number on the account. The person who answered gave some indication of being the account owner, but didn't answer quite as many questions correctly as the thief.

So what scenario is the agent hypothesizing? The person at the old number was actually the identity thief, and used the account for maybe several years without any challenge, before the actual owner changed the number back? That makes not the slightest bit of sense to me.

I think if the phone number has recently been changed, and you call the old number, and the person who answers can answer any question at all about themselves, you have to figure that's the account owner. Who else could it be???

clamprecht 1 day ago 3 replies      
The best part of the story is how he hacked the US immigration system to be able to stay in the US after serving his time:

> Factoring in time served and a reduction for good behavior, Naskovets got out in September 2012. He faced a deportation order that would have sent him back to Belarus. Representing himself in immigration court, he argued that he risked torture if sent home, based on his run-ins with the KGB. As a signatory to the U.N. Convention Against Torture, the U.S. cannot send someone back to a country knowing he's likely to be tortured. An immigration judge sided with Naskovets. The government appealed. Here's where Naskovets's optimism proved justified. While he was buffing floors in a county prison in Pennsylvania, his case had caught the attention of Stephen Yale-Loehr, a law professor who runs an immigration clinic at Cornell. With the help of Yale-Loehr and his students, Naskovets fought Immigration and Customs Enforcement in court for two years, and in October 2014 the agency decided to let him stay.

kriro 1 day ago 1 reply      
Commit crimes, get a relatively short sentence, pay a $200 fine, stay in the U.S. Crime does pay. Sucks pretty hard; identity theft is really nasty if it happens to you.

That being said, while slightly exaggerated, the claim of torture in Belarus isn't far-fetched. The dude in charge is pretty much a crazy dictator. I remember during the (last?) elections his main opponent was mysteriously beaten up, and he said in an interview that he shouldn't whine about it like a little girl.

p.s.: How do these arrests happen? Is Interpol involved, or can the FBI negotiate with the Czech government and just roll in there?

emir_ 1 day ago 1 reply      
Anyone have any idea whether this could have been prevented if banks in the US required a PIN to process a transaction? Would fraudulent transactions go down significantly if stolen cards couldn't be used without PINs?
manishsharan 1 day ago 3 replies      
Quick question: would you pay extra fees for a credit card that only allows transactions from whitelisted stores and, if used online, only ships to a whitelisted address?
mettamage 1 day ago 2 replies      
Why don't credit cards have a secret password like a pin code? I find it strange that all the security information is available on the card itself.
theumask 1 day ago 1 reply      
I didn't know that is the new way to get residence in the US... Hey, FBI guys, next time maybe send thieves like this one to spend some time behind bars in their home country, so they will truly get what they deserve.
Users Rationally Rejecting Security Advice (2009) schneier.com
155 points by anacleto  2 days ago   91 comments top 14
ghshephard 2 days ago 6 replies      
I like the approach they take in Singapore - take the default posture that users will probably not be security aware and will also reject your advice.

Want to log in to your bank account? It's 100% required that you have a two-factor SMS token in addition to your user ID and password.

Want to pay a bill to a new payee? Not only do you need your two-factor SMS token to first log in and then make the payment, you also need the physical token they sent you to crypto-sign the payee's account information before you can add the new payee.

Coming from the United States, I'm blown away by how much more secure (and convenient - love bank-to-bank transfers; haven't used a paper cheque in 2+ years) banking is in Singapore. That probably suggests why PayPal took off faster in the USA than here, as well.
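The "crypto-sign the new payee" step is, in spirit, a message authentication code computed by the physical token over the payee's details. A hypothetical sketch only: the shared secret, helper name, and 8-character code format below are my inventions for illustration, not the actual bank scheme:

```python
import hashlib
import hmac

# Assumption: the physical token and the bank share a provisioned secret.
SHARED_SECRET = b"provisioned-into-the-token"

def sign_payee(account_number: str) -> str:
    """Return a short code binding the approval to this exact payee."""
    mac = hmac.new(SHARED_SECRET, account_number.encode(), hashlib.sha256)
    return mac.hexdigest()[:8]  # the code the user keys into the site

code = sign_payee("123-456-789")
print(code)
```

The point of the design: a phisher who captures the SMS login token still can't add their own payee, because the signed code is bound to one specific account number.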

cantrevealname 2 days ago 3 replies      
Applied to a population, the argument makes sense:

100 million users spend 1 minute/day verifying URLs --> cost of $33M of lost productivity (assuming a wage of $20/h) --> avoids 10,000 successful phishing attacks (.01% of the population) --> saves $5M (10,000 victims, each losing $500) --> not worth following the security advice (since $33M is far greater than $5M)

Applied at an individual level, the argument makes less sense:

1 user (i.e., me) spends 1 minute/day verifying URLs --> cost of $0.33 of lost productivity --> avoid .01% chance of phishing attack --> avoid .01% chance of loss of $500 --> but in the event I do get phished, my loss is $500 + WEEKS of hassle with banks, credit reporting agencies, etc, to clean up the mess!

This is like the antibiotics trade-off. We don't want the population to overuse antibiotics to avoid building resistance in the population. But if I'm sick, and there's only a 10% chance that the antibiotic is useful (and 90% chance that my illness is viral and therefore the antibiotic is useless but otherwise harmless), then it's still in my individual interest to take it.
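The trade-off in the comment above can be written out directly; all inputs are the comment's stated assumptions, not measured data:

```python
POPULATION = 100_000_000      # users
MINUTES_PER_DAY = 1           # spent verifying URLs
WAGE_PER_HOUR = 20.0          # $/h
VICTIM_RATE = 0.0001          # .01% of the population
LOSS_PER_VICTIM = 500.0       # $ per successful phish

# Population level: total daily time cost vs. total avoided losses.
population_cost = POPULATION * (MINUTES_PER_DAY / 60) * WAGE_PER_HOUR
population_savings = POPULATION * VICTIM_RATE * LOSS_PER_VICTIM

# Individual level: my time cost vs. my expected dollar loss.
individual_cost = (MINUTES_PER_DAY / 60) * WAGE_PER_HOUR
individual_expected_loss = VICTIM_RATE * LOSS_PER_VICTIM

print(f"population: ${population_cost:,.0f} cost vs ${population_savings:,.0f} saved")
print(f"individual: ${individual_cost:.2f} cost vs ${individual_expected_loss:.2f} expected loss")
```

In pure dollars the advice loses at both levels ($33M vs $5M, and $0.33 vs $0.05); it's the unpriced "weeks of hassle" term that can flip the individual calculation, which is exactly the comment's point.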

danibx 2 days ago 2 replies      
I treat password strength relative to the importance I give the service I'm using. If it is something I care about, I will use an 8-12 character password with a few uppercase letters and digits. If it is something I don't care about but that requires an account, "1234" should be enough.

I have even given up on registering on a few sites because they required a safe password. This is getting even more common for me with mobile apps. Typing long passwords on a small touch-screen keyboard is difficult.

fluidcruft 2 days ago 0 replies      
I think that rationally it's likely even worse now (since 2009) in the sense that these massive data breaches keep happening and it has absolutely nothing to do with our own personal security behavior. It doesn't matter how careful we are with our security, it's going into the hands of the baddies anyway if they want it.
zyxley 2 days ago 1 reply      
As someone who uses 1Password for everything, the one thing that bothers me most is when passwords are limited to specific characters or to painfully short lengths. What the heck?
Nash921 1 day ago 0 replies      
That's a weird treatment of rationality.

If users reject security advice because they studied the costs and benefits, and found it unprofitable given the risks, then that's rational.

But if users reject security advice because "oh God it's too hard and it's probably not too bad anyway I have nothing to hide, right?", that's not rational. That's ignoring the problem and coincidentally getting the right answer.

It's like concluding users are rational for refraining from buying a lottery ticket. It turns out, though, the users didn't actually do any math, and were just too lazy to get up that morning.

istvan__ 2 days ago 3 replies      
Instead of giving them advice, those of us who understand how this works should make these things the defaults and not leave users exposed. What can users do in a world where banks ask you to read your credit card details aloud over the phone and hand over all the details that way? The next thing is a fake call from a criminal organization pretending to be the bank. How would a user detect that it is fake? I think security should be about rules and enforced practices rather than advice that can be happily ignored.
DanielBMarkham 2 days ago 0 replies      
It's worse than that. Since modern systems are multi-layered and many of the layers are not even administered by the user, even users that followed the advice given are vulnerable to loss. So for folks trying to make an economic trade-off in terms of time and hassle, it's all just a crap shoot. Do some stuff that feels reasonable, like installing Norton or something, come up with a password that includes both your name and your SSN, then wear gloves when you click on pron sites.

It's really quite ludicrous the situation we put the average user in. There are folks who spend hundreds of hours worrying about security and still get taken to the cleaners. What chance does Joe Sixpack really have?

recursive 2 days ago 0 replies      
I see roughly 1 certificate error per week browsing the web "in the wild". I've learned how to click through. I barely consciously notice them anymore.
ccvannorman 2 days ago 0 replies      
As a user I go through several security hoops per day, and I'm damned tired of it.. So many errors all the time..

stuffs money in mattress

stuffs facebook profile in there too (printout)

api 2 days ago 1 reply      
"Looking at various examples of security advice we find that the advice is complex and growing, but the benefit is largely speculative or moot. For example, much of the advice concerning passwords is outdated and does little to address actual threats, and fully 100% of certificate error warnings appear to be false positives."

I've thought the same about highly restrictive network firewalls for years. Most threats today are 'pulled' in via http, e-mail, software update feeds, etc., or entry is made via phishing or social engineering. Highly restrictive firewalls don't do anything about any of that, and they impose significant inconvenience. Your firewall is security theater.

Part of the problem with security is that it's a gut feeling, unsupported "expert" opinion, and tech-folklore driven discipline. At worst it's cargo-cultish and almost superstitious.

For one example consider the extremely common -- and utterly dumb -- belief among many that NAT improves security. It's a superstition. How? What threats does it mitigate that can't be mitigated otherwise? Get concrete, give examples, show data. Nope.

jorgeleo 2 days ago 1 reply      
I think that part of the problem is that security is not explained in layman's terms. There is slang in security circles that is not shared with the rest of the world.

Here is an example of how to explain SSL in simple terms. I have not seen many of these kinds of watered-down explanations.


Smushman 2 days ago 4 replies      
This article is from 2009. Most recent comment from 2010.

Possibly, it is meant as a reminder of something that we should all not forget?

Would the submitter please take the time to clarify the reasoning for necroing this article?

Smushman 2 days ago 0 replies      
I can't in good conscience mod you up, but I did laugh a bit at your comment...
Estimate the cost of a Web, iOS or Android app estimatemyapp.com
186 points by dynjo  1 day ago   151 comments top 31
ylg 1 day ago 4 replies      
This seems to me to be estimating the cost to get from a fixed spec to "works on my machine (or simulator)", rather than the total cost to go from an idea to a shipped product with non-technical stakeholders (whom your estimator is targeting), a diverse product team, and a changing understanding of the target; i.e., all the bits that investors struggle to understand seem to be omitted or minimized.

I've found in many years of building apps for clients that the easy part is forecasting simple programming costs. The hard part is helping folks understand that time spent keying in code is only a small part of shipping successful software, and one of the more predictable parts. So the cynic in me reacts with: "Great, another over-simplification that instills more of those unrealistic expectations that so often cause failure."

Minor points worth reiterating from others:

* The rates are off for areas other than yours (how about making rates/regions an option?). And, I'd make it much clearer that the viewer is likely looking at offshore outsourcing pricing as that is a very, shall we say, unique approach to new product development.

* "Man" hours is a small, visual irritant due to the problematic gender relations and inequities in our field at the moment.

CameronBanga 1 day ago 1 reply      
Biggest issue I have is that anyone who uses this calculator will always see their app as "small" when it's really medium or large. Non-technical project stakeholders will never grasp the depth of what they want built.

Second big issue is that there really isn't a good way in this calculator to estimate/account for proper testing and subsequently, time spent when a client changes their mind on a feature and wants to redesign/re-work.

amichal 1 day ago 5 replies      
Love the concept; not sure the math works out:

Maxed out iOS app shows:

35 person-days UI/UX + 162 person-days dev = $88,650

$88,650 / (35 person-days + 162 person-days) / 8 hr/day = $56.25/hr per person average cost.

Assuming the labor estimate is correct, if anyone knows of iOS developers skilled enough to pull that off at $56.25/hr fully-loaded labor cost, I'd like to meet them (e.g., who is coordinating the work and working with the client on evolving specs/feedback?).
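A quick check of the parent's average-rate arithmetic, using the comment's own figures:

```python
total_cost = 88_650      # quoted price, $
person_days = 35 + 162   # UI/UX + dev person-days
hours = person_days * 8  # 8-hour days

rate = total_cost / hours
print(rate)  # 56.25
```

The division is exact: 197 person-days is 1,576 hours, and $88,650 / 1,576 h comes out to $56.25/h.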

duiker101 1 day ago 2 replies      
Holy sh... I have to say, I am very bad at making estimates. I don't work freelance, but it has happened that I had to make some projects for people. I tried to estimate the cost of some applications that I made/make in my spare time, and the average is way, way, way higher than I would have ever charged anyone... I need to reconsider my prices...

Which is amazing, to be honest; setting prices is really hard for me, and a tool like this can be an amazing opportunity to find my value. Thanks!

fullwedgewhale 1 day ago 2 replies      
So having done cost estimation before, this is horrible. About all it would be good for is the initial swag at brochure-ware.

Generally speaking, cost estimates should never be a fixed point; they should be given as a range, with assumptions that (when violated) allow the cost numbers to change. In addition, there's no measurement of complexity through requirements. Basically these estimators are useful only for the simplest software projects with extremely limited requirements and scope.

The cost of the software is a function of the known requirements, which are carefully enumerated during costing to generate some sort of complexity model (for example, function point analysis). That's fed into a cost model that translates the complexity model into a dollar range. That interval is then used to generate a final, contractual number. But notice all the work that went into that estimate. What's more, the estimate is refined during the work, such that new data is incorporated into the complexity model, allowing changes in scope or requirements to be re-priced.
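A minimal sketch of the range-not-point idea described above, using three-point (PERT-style) estimates per feature, a simpler cousin of the function-point approach the comment mentions. The feature names and numbers are made up for illustration:

```python
# name: (optimistic, most_likely, pessimistic) in person-days
features = {
    "auth": (3, 7, 15),
    "messaging": (4, 6, 14),
    "payments": (5, 10, 25),
}

def pert_mean(optimistic, most_likely, pessimistic):
    # Classic three-point weighting used in PERT.
    return (optimistic + 4 * most_likely + pessimistic) / 6

expected = sum(pert_mean(*t) for t in features.values())
low = sum(o for o, _, _ in features.values())
high = sum(p for _, _, p in features.values())
print(f"{low}-{high} person-days, expected ~{expected:.1f}")  # 12-54 person-days, expected ~26.3
```

Even this toy version makes the key point: the honest answer is an interval whose width depends on the per-feature assumptions, not a single dollar figure.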
CSEngineer13 1 day ago 2 replies      
This is a pretty solid resource for freelance devs to mid-sized digital agencies. It falls short for more complicated applications that require custom architecture, a consultation piece, etc.

I think a lot of agencies/shops could benefit by creating a version of this that fits their pricing model based on previous projects -- Not as a hard estimate, but as a ballpark figure.

joshcanhelp 1 day ago 6 replies      
Not to detract from the tool but ... $56/hour default? That's low for any competent development, let alone iOS.
x1024 1 day ago 1 reply      
Doesn't work.

XMLHttpRequest cannot load https://d3h99m5mv5zvgz.cloudfront.net/api/v1/features/list. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://estimatemyapp.com' is therefore not allowed access.
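For reference, the failure above is the API response lacking a CORS header. A server-side sketch of the fix using a throwaway stdlib server; the origin value and endpoint path are taken from the error message for illustration, not from Oozou's actual setup:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class CORSHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b'{"features": []}'
        self.send_response(200)
        # The header whose absence causes the browser error above.
        self.send_header("Access-Control-Allow-Origin", "https://estimatemyapp.com")
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Serve exactly one request in the background, then fetch it.
server = HTTPServer(("127.0.0.1", 0), CORSHandler)
threading.Thread(target=server.handle_request, daemon=True).start()

resp = urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/api/v1/features/list")
allow_origin = resp.headers["Access-Control-Allow-Origin"]
server.server_close()
print(allow_origin)
```

In production the header would be set at the CDN/origin configuration rather than in app code, but the missing piece is this same response header.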

keedot 1 day ago 1 reply      
This is great. I can finally send people here when they think getting a site up can be paid for with a case of beer.
CodeSheikh 1 day ago 0 replies      
I hope none of my future clients come to me with a reference to this app for cost estimation. This estimation is vague and garbage. Good effort though.
brandonmenc 1 day ago 0 replies      
They must be trying to undercut everyone else, because those prices are way too low. I wonder how many times they go over these budgets themselves.
kemiller 1 day ago 2 replies      
I don't buy that Android is less than iOS for development, unless it's reflecting some shared work.
berns 1 day ago 0 replies      
Let's see... web app... small... click here... click there... one more click... $14,000. I don't know what I've been doing for the last 25 years, but now I know that I need the next 7 years to be a millionaire. Thank you!
AznHisoka 1 day ago 0 replies      
Yep, it costs $38,000 to make a Google.

Except you won't have the search algorithm, machine learning algorithms, and Hadoop stuff. But at least you'll just have a nice polished UI, with integration with Facebook, Twitter and Google+!

ommunist 1 day ago 0 replies      
Curiously, the page does not work on iPad in the built-in HN browser. Anyway, it's priceless, since it will give an exact answer to most of those wanting "my own branded messenger, and I only have 500 for this".
dynjo 20 hours ago 0 replies      
By popular request, 'man days' has now been banished from the site :-)

In our defence: https://en.wikipedia.org/wiki/Man-hour

marpalmin 1 day ago 0 replies      
We have a similar concept, with a far less fancy estimator: http://www.decemberlabs.com/appEstimator.html
I think these estimators are good at helping non-savvy potential customers to understand the actual costs of making an app.
omouse 1 day ago 0 replies      
This is clever but it definitely needs to be backed by real data and adjusted in the future. This would be a great use case for all the data that's been collected for research into programmer productivity!
buro9 1 day ago 1 reply      
Would love to see examples of their work, showing how that translates into the options in the estimation tool.

i.e. What is their definition of "polished"?

hcbogdan 1 day ago 1 reply      
Site with Access-Control-Allow-Origin header error:

XMLHttpRequest cannot load https://d3h99m5mv5zvgz.cloudfront.net/api/v1/features/list. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://estimatemyapp.com' is therefore not allowed access.

quakenou 1 day ago 0 replies      
Most sites do it already:


sarciszewski 1 day ago 1 reply      
So, if we go off the market rates for a business consultant skilled at developing a web application ($1k to $2k per day per consultant), and plug in the numbers here:

35 Man Days UX/UI Design -> probably $35k to $70k

178 Man Days Development -> $178k to $356k

The total cost is: $213k to $426k

The calculator's estimate: $95,850

So the actual cost is going to be anywhere from 2x to 5x higher than what this projects.
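Checking the parent's figures (the $1k-$2k/day rates are the comment's assumption):

```python
ui_days, dev_days = 35, 178
low_rate, high_rate = 1_000, 2_000   # $/day per consultant
calculator_estimate = 95_850         # $, from the tool

low = (ui_days + dev_days) * low_rate
high = (ui_days + dev_days) * high_rate
print(low, high)                               # 213000 426000
print(round(low / calculator_estimate, 1),
      round(high / calculator_estimate, 1))    # 2.2 4.4
```

That puts the consultant-rate total at $213k-$426k, i.e. roughly 2.2x to 4.4x the calculator's figure.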

slamus 1 day ago 0 replies      
I'm against this website.
jheriko 1 day ago 0 replies      
really doesn't fit my style of apps at all. but i can see how this might be useful for cookie cutter style work...
j45 18 hours ago 0 replies      
What I like about this is that it introduces the concepts that can impact mobile app development to a crowd that has no way to know where one feature ends and another starts. If it's meant to be a conversation starter, I bet it works for non-experts.
evook 1 day ago 0 replies      
Is there any advantage over an Excel sheet?
nso95 1 day ago 1 reply      
Why the hell don't the buttons work on mobile safari?
egeozcan 1 day ago 5 replies      
I'm not a native speaker but I always thought the word "man" is also used as a short way to say "human".

> the human individual as representing the species, without reference to sex; the human race; humankind [0]

[0]: http://dictionary.reference.com/browse/man

onion2k 1 day ago 2 replies      
I chose an iOS app with every option, and then added an Android app with every option. The price doubled. I guess that means Oozou will charge you twice for designing or coding assets that'll be reused between platforms.
oliwarner 1 day ago 5 replies      
There are some outrageous time estimates in this. 7 days to create an auth system with the big 3 social integrations? 6 days for private messaging? 2 days to dump a prefab social sharing blob in your template?! I'd do all of that in a morning.

I'd just be careful if you're considering using this to quote your jobs. If somebody working for me quoted these sorts of numbers, I'd fire them on the spot. I wouldn't even give them a second chance. It would clearly demonstrate you didn't know what you were talking about.

Getting too little sleep can have serious health consequences (2013) theatlantic.com
164 points by gmays  2 days ago   69 comments top 17
otherusername2 1 day ago 8 replies      
I once, involuntarily, went without sleep for around 72 hours. I was at a music festival, and we were either pulling through the night or I couldn't sleep because of the noise. When the time came to go home, I really felt pretty good. At that time I had been awake for around 60 hours. My body was tired and my thinking was a bit sluggish, but nothing too bad...

Until I got home and tried to sleep. By then I was feeling dead-tired, but I simply couldn't sleep. This wasn't your average "Oh I can't sleep, guess I'll do something else and try again later" case of 'insomnia'. I was so extremely tired; all I wanted was to sleep. I started having hallucinations much like the early stages of a mushroom trip (minus the fun). My eyes couldn't focus, I couldn't think.

Finally I managed to fall asleep (while constantly suppressing panic attacks; something I've suffered from in the past and know how to deal with now). I had lucid dream after lucid dream during that sleep. It was very unsettling as I got the feeling (during lucid dreaming) that I wasn't getting any "real" sleep and I'd go insane.

All in all the lucid dreaming aspect was pretty cool, and in retrospect perhaps worth the unsettling experience of being awake so long. But I'll never ever want to repeat that experience.

narsil 1 day ago 0 replies      
Psychiatric disorders such as schizophrenia are more likely to manifest in those already susceptible to them, if they are subject to sleep deprivation. This remains one of the few known correlations in long-term effects of sleep deprivation.


icehawk219 1 day ago 2 replies      
Sleep is something I've kind of battled with for some years now. I have a job with a long commute and long hours, but I still like to be able to come home, make dinner, and watch an episode or two of some of my favorite shows. When I do, though, I end up awake instead of tired. If I do that, I get 4-5 hours of sleep at night, but if I get the full 8+, I basically have no life but work and sleep.

Basically everyone I know always tells me, "Only 4-5 hours? That's not healthy!" But here's the thing: I don't feel like it is. I wake up fine, go about my day, and usually fall asleep when I'm tired. I don't need to chug caffeine to stay awake or take shots of some energy drink. Once I wake up I'm wide awake and alert, despite getting so much less sleep than everyone tells me I'm supposed to get.

yAnonymous 1 day ago 3 replies      
"Spending 25% more time sleeping increases your life expectancy by 25%."


darkFunction 1 day ago 1 reply      
This reminds me of a programmer and very talented blogger who wrote a book ("And Then I Thought I Was a Fish") about his experience of staying up for many days under the influence of LSD, and who was also committed to an institution as a result of his delusions. A very interesting and entertaining read. http://www.stilldrinking.org/the-episode-part-1
gdubs 1 day ago 0 replies      
Funny how not sleeping is still considered a badge of honor -- a demonstration of strength. As a new parent, I've let go and will happily go to sleep at 8PM if the opportunity arises.
vincentbarr 1 day ago 0 replies      
I've found CBT-I (cognitive behavioral therapy for insomnia)[0], an asynchronous 'course', to be helpful in improving my sleep. Of note, some elements of the program (particularly the metrics/habits that are most important to improving sleep) were not factors I had previously focused on or even really considered.

I'm 3 weeks into the program and this is the first time I've been even remotely successful at improving my sleep.

I sound like an advertisement.

[0] http://www.cbtforinsomnia.com/

nemesisrobot 1 day ago 0 replies      
I've had the opportunity to sleep in a lot lately, and I've found that I feel physically and mentally the most rejuvenated around the 9 hour mark. Shorter than that and my thinking is on the sluggish side, and longer and I feel tired and unmotivated. Personally, I've found the time that I wake up to also be a factor. Waking up at 7 or 8 in the morning after 9 hours of sleep has me in a better mood than at 10 or 11 with the same amount of sleep.
jameshart 1 day ago 0 replies      
The headline is, of course, tautologically true. Too much|little anything can have serious consequences, because "too much" or "too little" is defined by its negative consequences. "Getting enough sleep" can't have serious health consequences, because... it's enough not to. Too little sleep is "too little" precisely because it has health consequences.
Avalaxy 1 day ago 3 replies      
"Too little sleep can have serious health consequences"

"Too much sleep can have serious health consequences"

alexholehouse 1 day ago 1 reply      
One angle not touched on here is the link between Alzheimer's disease (AD) and sleep. There has always been evidence that AD patients sleep less than the general population, but obviously there's a chicken-and-egg problem here: are they sleeping less because of neuronal disruption caused by the disease, or is the disease progression catalyzed by a lack of sleep[1-3]? While I think this is still an open question, there's growing research suggesting that intervening in sleep-related systems in the brain can have a major therapeutic impact on the disease itself. Obviously this doesn't answer whether it's 'sleep' or some process related to sleep, but clearly something is going on here [4].

From the press release associated with [4];

"The new research, in mice, demonstrates that eliminating the protein called orexin made mice sleep for longer periods of time and strongly slowed the production of brain plaques."

"This indicates we should be looking hard at orexin as a potential target for preventing Alzheimer's disease," said senior author David M. Holtzman, MD, head of the Department of Neurology. "Blocking orexin to increase sleep in patients with sleep abnormalities, or perhaps even to improve sleep efficiency in healthy people, may be a way to reduce the risk of Alzheimer's. This is important to explore further." - [5]

[1] Xie, L., Kang, H., Xu, Q., Chen, M. J., Liao, Y., Thiyagarajan, M., Nedergaard, M. (2013). Sleep drives metabolite clearance from the adult brain. Science, 342(6156), 373-377. http://www.sciencemag.org/content/342/6156/373

[2] Ju, Y.-E. S., McLeland, J. S., Toedebusch, C. D., Xiong, C., Fagan, A. M., Duntley, S. P., Holtzman, D. M. (2013). Sleep quality and preclinical Alzheimer disease. JAMA Neurology, 70(5), 587-593. http://www.ncbi.nlm.nih.gov/pubmed/23479184

[3] Ju, Y.-E. S., Lucey, B. P., & Holtzman, D. M. (2013). Sleep and Alzheimer disease pathology - a bidirectional relationship. Nature Reviews. Neurology, 10(2), 115-119. http://www.nature.com/nrneurol/journal/v10/n2/abs/nrneurol.2...

[4] Roh, J. H., Jiang, H., Finn, M. B., Stewart, F. R., Mahan, T. E., Cirrito, J. R., Holtzman, D. M. (2014). Potential role of orexin and sleep modulation in the pathogenesis of Alzheimer's disease. The Journal of Experimental Medicine, 211(13), 2487-2496. http://jem.rupress.org/content/211/13/2487

[5] https://news.wustl.edu/news/Pages/27721.aspx

marincounty 1 day ago 0 replies      
It seems like every person I know has problems with their sleep. My only reference is the United States; I wonder if other countries have such a high percentage of the population with sleep problems?

I didn't read the article. I have read so many sleep studies; I have kind of given up on ever having the kind of sleep I had up until about 25. I hate not getting enough sleep. To me, sleep is more important than money. I once remember thinking I would sell my soul to the devil in order to get some good, sound sleep. (God forgive me--just joking. Yea, I'm still not positive, and don't want to piss him off.)

rrodriguez89 1 day ago 0 replies      
"Modafinil": the solution to all the problems
ilaksh 1 day ago 1 reply      
Is it really such a mystery anymore?

Routine maintenance is prudent for any system. Sleep is a maintenance mode.

At night, the sun goes down, it gets cold, and predators come out. You save a ton of energy, and maybe your life, by avoiding activity at that time. When it's cold you burn extra calories just to stay warm.

Haven't biologists got a lot of evidence by now?

dominotw 1 day ago 0 replies      
no shit!!
richmarr 1 day ago 0 replies      
Oh dear
How I nearly almost saved the Internet, starring afl-fuzz and dnsmasq skullsecurity.org
154 points by xorrbit  1 day ago   27 comments top 3
joosters 1 day ago 3 replies      
The crossed out '2025' in the timeline is bizarre. Have I missed some joke here, or does the author not like to use the delete key?
viraptor 21 hours ago 0 replies      
I really didn't expect dnsmasq to count the number of hops. I found a very similar issue in systemd-resolved, but with a different fix:


The compressed label should never be allowed to refer to itself in the first place, so there's no point in counting how many times you loop.
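The backward-pointer rule can be sketched in a few lines. This is an illustrative decompressor, not the actual systemd-resolved or dnsmasq code; it rejects any compression pointer that does not point strictly backward, which makes loops impossible without counting hops:

```ruby
# Decompress a DNS name starting at `offset` in the raw message bytes.
# Instead of counting pointer hops, require every compression pointer to
# target a strictly earlier offset; offsets then decrease on every jump,
# so a pointer loop cannot occur by construction.
def read_name(msg, offset)
  labels = []
  loop do
    length = msg.getbyte(offset)
    break if length.zero?                        # root label: end of name
    if length & 0xC0 == 0xC0                     # compression pointer (RFC 1035 4.1.4)
      target = ((length & 0x3F) << 8) | msg.getbyte(offset + 1)
      raise "compression pointer does not point backward" if target >= offset
      offset = target
    else
      raise "label too long" if length > 63
      labels << msg.byteslice(offset + 1, length)
      offset += 1 + length
    end
  end
  labels.join(".")
end
```

A self-referential pointer like `\xc0\x00` at offset 0 fails the `target >= offset` check immediately, so no loop counter is ever needed.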

JadeNB 1 day ago 2 replies      
Why was the title changed from the blog post's title ("How I nearly almost saved the Internet ...")?
A Complete Guide to Rails Caching nateberkopec.com
171 points by nateberkopec  2 days ago   55 comments top 12
varmais 2 days ago 1 reply      
> Developers, by our nature, are very different from end-users. We're used to the idea that when you interact with a computer, it takes a little while for the computer to come back with an answer.

I don't agree. End users are used to shitty systems that take a while to load, but we developers should know better. We should be able to recognise performance bottlenecks during architecture design and development, and we should be able to measure the layers where optimisation is necessary and useful. When developing a web app, caching should always be kept in mind as one tool for improving system performance.

Otherwise fantastic piece of information.

douglasfshearer 2 days ago 0 replies      
Fantastic resource.

No mention of HTTP caching though, which for a whole class of applications is a great way to minimise rendering time.

Rails has great support[1] for etag based content expiry.

[1] http://api.rubyonrails.org/classes/ActionController/Conditio...
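Conditional GET boils down to comparing a validator the client sends back (If-None-Match) against one computed for the current response. A minimal sketch of the server-side logic in plain Ruby, independent of Rails (the method and header-hash shape here are illustrative):

```ruby
require "digest"

# Compute a weak ETag from the response body and decide whether the
# client's cached copy is still fresh. Returns [status, headers, body].
def conditional_get(request_headers, body)
  etag = %(W/"#{Digest::SHA1.hexdigest(body)[0, 16]}")
  if request_headers["If-None-Match"] == etag
    [304, { "ETag" => etag }, ""]      # client copy is fresh: skip the body
  else
    [200, { "ETag" => etag }, body]    # full response, with validator attached
  end
end
```

In Rails, `fresh_when`/`stale?` from ActionController::ConditionalGet do this for you, and can derive the validator from a record's cache key rather than hashing the rendered body, which also skips rendering entirely on a match.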

blizkreeg 2 days ago 3 replies      
Does anyone here use compression on all of their cached content? Our Redis cache store has ballooned up to about 8G (we store a lot of html/json fragments) and is unwieldy to ship around when we want to debug bad data bugs on our dev machines. We are experimenting with lz4 compression now and the speed-compression ratio tradeoff looks pretty good with it.

What has been your experience with Rails caching + compression?
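For comparison with lz4 (which needs a gem), the stdlib baseline is Zlib's deflate; repetitive HTML/JSON fragments tend to compress well under it. A minimal sketch, with hypothetical helper names wrapping the cache write/read path:

```ruby
require "zlib"

# Compress fragments before they hit the cache store; the store only
# ever sees the compressed bytes, shrinking memory use and transfer size.
def compress_for_cache(fragment)
  Zlib::Deflate.deflate(fragment)
end

def decompress_from_cache(blob)
  Zlib::Inflate.inflate(blob)
end

# Repetitive markup, as in cached list fragments, compresses dramatically.
fragment = "<li class=\"item\">todo</li>" * 100
blob = compress_for_cache(fragment)
```

Deflate trades more CPU per read/write for the smaller store; lz4's appeal is much cheaper decompression at a somewhat worse ratio.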

beefsack 2 days ago 0 replies      
I found the animated GIF next to the article to be incredibly distracting while reading. I had to remove it using the inspector to be able to get through the article.
driverdan 2 days ago 1 reply      
This is a great guide but it's not complete. One of the biggest problems with all of these guides is that they focus solely on view caching.

As far as I can tell the Rails ecosystem completely lacks a good model / data caching system. There are a few gems that do model caching but they all have major flaws. I'd love to see a good guide on Rails data caching and gems that eliminate the mess of calls to Rails.cache.fetch.
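For context, `Rails.cache.fetch` is a read-through helper: return the cached value if present, otherwise run the block, store its result, and return it. The calls proliferate because invalidation is left entirely to the caller. A stripped-down sketch of the pattern (illustrative, not Rails internals):

```ruby
# Minimal read-through cache, mirroring the shape of Rails.cache.fetch.
class MiniCache
  def initialize
    @store = {}
  end

  # Return the cached value for `key`, or compute, store, and return it.
  # The block only runs on a cache miss.
  def fetch(key)
    return @store[key] if @store.key?(key)
    @store[key] = yield
  end

  # Invalidation is manual, which is what scatters cache logic everywhere.
  def delete(key)
    @store.delete(key)
  end
end
```

Usage looks like `cache.fetch("user/42/profile") { expensive_render }`; a model-caching layer would generate those keys and deletes for you, which is the gap the comment is pointing at.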

arohner 2 days ago 0 replies      
Great article. You should additionally measure page speed as experienced by your users, because other pesky things like network congestion, the user's browser & hardware and the speed of light all affect website performance. If you measure from every user's browser, you'll get very detailed performance info. A chart from a recent blog post of mine: https://s3.amazonaws.com/static.rasterize.com/cljs-module-bl...

Just because the page loads quickly on your laptop doesn't mean it loads quickly for everyone. I'm working on a tool to measure this stuff: https://rasterize.io/blog/speed-objections.html, contact me if you're interested in early access.


chrismorgan 1 day ago 0 replies      
> First, figure about 50 milliseconds for network latency (this is on desktop, latency on mobile is a whole other discussion).

And outside the USA, add on another 200ms more. I, an Australian, visited the USA last year and was surprised, although I had expected it, at how much faster the Internet was.

I get the general feeling that people in the USA end up much more picky about performance than their Australian counterparts, because they're used to a result which is quite a bit faster anyway.

It's rather a pity that we as an industry don't care more about performance, because on average we're utterly, abysmally, appallingly atrocious at it. There's simply no good reason for things to be as slow as they are.

resca79 2 days ago 1 reply      
This is a great article.

> The most often-used method of all Rails cache stores is fetch

It's true, but I think you should also add performance tests for writes as well as reads, because the big problem with a db/cache is that writes also influence reads (fetch). Another big problem is expiration vs. garbage collection once memory is full.

why-el 2 days ago 1 reply      
> but usually virtualization is a mostly unnecessary step in achieving production-like behavior. Mostly, we just need to make sure we're running the Rails server in production mode.

Isn't this assuming your development mode has the same memory/cpu as your production? I can't tell you how many times I get questions from clients who ask why their 16GB dev box is running fine while their 512MB dyno is slow. The point of a docker image is to limit these resources, which `rails s production` does not do.

drm237 2 days ago 0 replies      
In the example where you cache the todo list and use an array of the todo item ids plus the max updated time, why is that approach better than just updating the list's updated_at whenever a child record is changed? With your approach, each todo item needs to be pulled from the db, which should be slower than just checking the parent's updated_at for a cache hit.
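The tradeoff in question is between key-based expiration (the cache key embeds the child ids and the newest updated_at, so stale entries are simply never read again) and touch-based expiration (children bump the parent's timestamp on every write). A sketch of the key-based variant, using a hypothetical Item struct rather than real models:

```ruby
require "digest"

Item = Struct.new(:id, :updated_at)

# Build a cache key from the member ids and the most recent update time.
# Any add, remove, or edit changes the key, so the old cache entry just
# goes stale; the cost is reading every item on each request to compute it.
def list_cache_key(items)
  newest = items.map(&:updated_at).max
  Digest::MD5.hexdigest("todos/#{items.map(&:id).join('-')}/#{newest}")
end
```

The touch-based alternative (in Rails, `belongs_to :list, touch: true` on the child) moves that cost to write time: each child save updates the parent row, so checking for a cache hit only needs the parent's updated_at, at the price of extra writes and lock contention on hot parents.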
derwiki 2 days ago 1 reply      
Why is `caches_action` not considered a best practice? Using that for popular landing pages, I'm able to serve over half of my requests in < 10ms.
rubiquity 2 days ago 3 replies      
> Why don't we cache as much as we should?

An alternative question:

Why do we have to cache so damn much in Ruby and complicate the hell out of our infrastructure and code?

Because Ruby is slow.

       cached 18 July 2015 02:11:03 GMT