hacker news with inline top comments    28 Dec 2014
German Defense Minister von Der Leyen Fingerprint Copied by Chaos Computer Club
36 points by sjreese  1 hour ago   28 comments top 8
SEJeff 16 minutes ago 0 replies      
A fingerprint is a good username and ridiculous as a password. This excellent article captures my thoughts pretty well:


rfrey 9 minutes ago 1 reply      
When I was working in biometrics 10 years ago, nobody thought fingerprints were a good userid OR a password. It was always presented as a part of multi-factor authentication, to be used in conjunction with passcards and/or passwords.

"Something you have, something you know, something you are".

georgespencer 5 minutes ago 0 replies      
Biometric security on your cell phone is still a great tradeoff between security and convenience when compared to four-digit PINs.
HackinOut 46 minutes ago 0 replies      
Although I have no doubt it would work, I guess they didn't actually try the copy? I couldn't find the video of the conference talk. He probably demoed using a copy of his own fingerprint from a photo?

It's great work. I hope the fact that you can make a copy from a simple HD photo will bury people's ideas about fingerprint security for good.

spacefight 51 minutes ago 1 reply      
In other words: biometric identification is still broken.
junto 1 hour ago 2 replies      
It didn't mention it specifically, but I presume you could then 3d print a replica?
runn1ng 18 minutes ago 0 replies      


jacquesm 20 minutes ago 2 replies      
That's not the first time they've done this:


Look, no hands
443 points by jamesisaac  13 hours ago   101 comments top 43
mvandy 4 hours ago 12 replies      
I'm Michelle (the girl in the video). This thread is really fascinating to me. I'll add some more details to the discussion below. I was diagnosed with epicondylitis (tennis elbow) and I tried a range of different treatments: electrical therapy, acupuncture, anti-inflammatory tablets and some kind of light therapy(?), but without any results (probably why I can't remember the last treatment).

Also noticed there were some questions about neck pain. If I adjust the nose-pad to the right level and angle it right, it forces me to sit up straight. However, in the video my back is arched quite a lot because I was sitting at a table that was too low (but it was only a temporary setup).

arafalov 1 hour ago 0 replies      
I had RSI 3 times. The first time I did not know what it was and ended up not being able to hold a pen. It took many months of recovery and I was lucky my doctor knew what it was. I had heat treatment, some sort of laser and vibration therapies and a couple of other things. I don't know which of them helped, but they sure did.

The second time, I was in a different country and was sent to a surgeon. He had no clue what RSI was and said that I should not worry until it got worse and was ready for carpal tunnel surgery. I said "no thanks" and found a sports therapist instead who was ready to help.

The main lesson here is that "see a doctor" does not always work. I am sure that if I had seen a surgeon the first time, I'd have stitches on my wrists right now. Back then, RSI was not a well-known thing at all. Hopefully, it is more easily recognized now.

arjie 11 hours ago 5 replies      
The elasticity of the brain is incredibly impressive. Very neat solution. My parents are surgeons, and I know they worry at least a little bit about the safety of their hands. While my hands are definitely very important to my being able to work I know I don't worry as much as they do. That's funny, because for the most part I think about Carpal Tunnel Syndrome as a career-ending condition.

I recently suffered some amount of tendonitis in my wrist and it prompted me to make quite a few changes:

* Better posture

* Better seat adjustment

* A nicer (mechanical) keyboard

* Practising touch typing more (i.e. correcting myself any time I use the wrong finger)

* Resting my wrists evenly

I made all these changes simultaneously so I don't know what changed it or if it was a one-time thing and resting fixed it. An interesting thing is that I found that I overuse my right hand.

`vim` binds simple movements to hjkl and that's fine because they're on the home row, but it also means that a lot of the time I'm holding down one key while reading code. I've switched to moving around code better now, using larger jumps, and when scrolling a lot I use my left hand. I've also rebound some other things so that they're easier with home-row keys. Anyway, learning to be faster at all this took very little time. I am very impressed with how fast we learn new acts if we keep repeating them.

_ZeD_ 11 hours ago 3 replies      
Sincerely... all I can think about is not that she found a good solution to her problem, but the fact that she "achieved the same level of accuracy" (and I suppose speed).

This is not something about her capabilities, but about the limitation of current input devices regarding our hands.

This is proof that using a touch pad with your nose is no worse than using it with your hand. There is something wrong in this: try using any real-world interface with your hand, and notice how the shapes, the stiffness, the flexibility of any handle, pen, button or spring you interact with give you some kind of information and let you operate with a superior kind of consciousness.

felixg 1 hour ago 0 replies      
SmartNav by NaturalPoint (http://www.naturalpoint.com/smartnav/). I've been using it for over 8 years for moving the cursor. Clicking is via a regular mouse, though voice and switches are supported. All typing is done through an on-screen virtual keyboard (Hot Virtual Keyboard). After short self-training, the speed and accuracy are above those of a regular mouse. Neck fatigue is a rare issue even after 8-10 hours of daily usage; the "trick" is calibrating the settings for minimal movement. I'm using this setup for everything from programming to some light gaming (RTS). Adjusting and configuring the environment can bring substantial improvement. For example, using ReSharper for Visual Studio vastly reduces typing.
userbinator 12 hours ago 1 reply      
I wonder if her neck started getting tired from all that movement.

It seems that it's mostly people who grip their mouse really tightly/type with tense fingers that experience the most problems - I remember when I first started typing, my fingers tired too easily because the keys were heavy, and I was exerting a lot of force trying to get the fingers to exactly where I wanted them to go. Later, when I got a "looser" keyboard and discovered that I didn't really need to hit the keys exactly in the middle but whatever could actuate them worked, my speed more than doubled and I could type for hours without feeling tired at all. The relaxation really helps. Same with mousing - if you find that you have to grip your mouse tightly to make precise movements, turn down the DPI and try lubricating it so it requires as little effort as possible to move. Personally, I don't really like using trackpads because of that friction.

noonespecial 4 hours ago 1 reply      
File this in the "for what it's worth" file:

When I started to get some pain in my wrists, I noticed it most when I was using photoshop and clicking a great deal. I tore apart an old USB mouse and wired a pair of foot switches in so I could click with my feet as well. It helped quite a bit. Now I rather prefer it, especially the right click action.

IkmoIkmo 3 hours ago 0 replies      
Beyond the obviously admirable content of the story, just wanted to say: beautiful website, the pacing of the story was really well done. Loved the animations, the optional extra reading, breaking up the story with pictures. Very minimal but well done.
qwerta 2 hours ago 0 replies      
My cousin was born without hands. He uses his feet for most things. Today he is 22; he drives a car, handles computers and tablets... His typing is OK, so he could be a programmer if he chooses to. And he wants to (and probably will) move to his own place.

He does not use special devices, just a big mouse and a large AT/IBM keyboard.

PL1 1 hour ago 0 replies      
Have you checked Dr. Sarno's approach for RSI and related problems? If you have tried all those options and they didn't work, you should definitely give it a try.

You can check a story very similar to mine: http://www.pgbovine.net/back-pain-guest-article.htm

Disclaimer: I am a CS PhD student at a top-tier US school. Suffered from RSI, tried everything. Was cured by Dr. Sarno's technique.

meepmorp 10 hours ago 4 replies      
Just reading the responses here - am I the only person who has zero problem using a trackpad and keyboard with my hands for >=8 hours a day? Maybe it's that I'm not a designer and so the precision of my motions isn't such that I'd experience problems like this. I think the only problem I've ever had was back when I tried using a Microsoft Natural keyboard, and my hands were sore after an hour.

Also, good for her, finding a solution that works well for her, strange though it seems at first blush. I like her work, too.

EsotericSoft 5 hours ago 0 replies      
I've tried head tracking devices and found my neck and shoulders get stressed. I imagine the same thing would happen hunched over a trackpad with the nose. The small neck movements are similar to head tracking.

Check out the Imak SmartGlove with Thumb, I can't use a computer without it. Well, of course I can but it is much less comfortable. The glove plus a Kensington Expert Mouse (which is actually a trackball with a scroll ring) plus a good chair (eg Steelcase Leap) will help a LOT.

yourad_io 1 hour ago 0 replies      
Just FYI - if you're interested, make sure to read the full story in No.2. "Read more" shows the rest of it.
jacquesm 12 hours ago 1 reply      
That's got to be tough on the skin at the tip of the nose, that's not exactly the most rugged spot to be rubbing on something all day long.



auvi 12 hours ago 0 replies      
I remember CMU (or a site that was linked from a links page in CMU vision site) had a project called Nouse where you can operate the computer with a video camera using your nose. The nose doesn't touch anything.
acqq 4 hours ago 0 replies      
Also worth taking a look:

"The Association of Mouth and Foot Painting Artists of the World"


"VDMFK supports and promotes artists who, due to disability or disease, cannot create their works of art with their hands, but have to use their mouths or their feet."

fillskills 12 hours ago 0 replies      
Great example of adaptability and resilience. I understand a little bit about her pain as I had to move to my left hand after experiencing pain in my right. Though I did nothing as extreme as the OP.

Thanks for the inspiration Michelle

plicense 7 hours ago 0 replies      
I just went "Holy shit" when I read this article and realized what a lazy dumba*s I am.
Morphling 8 hours ago 0 replies      
I just wonder if she could've easily fixed all the issues by changing her mouse or how she held it, instead of opting for this nose-touch-pad.

If she is still doing those 11-15 hour work days and her neck doesn't get tired, I really think she might have had too hard of a grip on her mouse or just a bad mouse in general.

quantgenius 3 hours ago 0 replies      
Hi Michelle, I was wondering if you had tried the Tyler Twist or the reverse Tyler Twist. I had tennis elbow and golfer's elbow and nothing worked until I tried those exercises.
known 56 minutes ago 0 replies      
I salute your perseverance.
melling 9 hours ago 0 replies      
She mentions being excited about the Leap Motion. It's not very accurate, so she probably had to give it up. There is better hand tracking software under development. This company's Kickstarter project got canceled when Oculus bought them: https://www.kickstarter.com/projects/nimblevr/nimble-sense-b...

And Control VR uses gloves but might prove to be very accurate: http://controlvr.com

The VR headsets are driving the development of this technology.

bilalq 10 hours ago 1 reply      
This is really amazing and inspiring to see. Every now and then, I get scared when thinking about what would happen if my eyes or my hands started to lose functionality.

I sometimes feel pain in my pinky fingers when typing, forcing me to adapt to using only my other fingers.

unclesaamm 10 hours ago 0 replies      
Or: how I developed carpal tunnel in my neck

Just kidding. Glad to see such resourcefulness.

roseperrone 1 hour ago 0 replies      
RSI is curable almost always! Read "Pain Free at Your PC" or "Pain Free" by Pete Egoscue. Message me and I'll gift either to you.
owenwil 11 hours ago 0 replies      
This is awesome. Fun to see someone trying something new and pushing the boundaries of how computer interaction is done, considering how stressful on the body it can be to use a mouse.
johnny99 9 hours ago 0 replies      
Dominic Wilcox created something similar to this in 2011, I think as a satiric art project. It looks like a plague doctor's mask: http://dominicwilcox.com/portfolio/finger-nose-stylus/

To see a proper implementation, which allows someone to work at a high level, is awesome.

Kluny 9 hours ago 0 replies      
Great story, and a great reminder not to let minor pains progress to the point where they cripple you. I say this as someone whose right thumb is in a brace due to de Quervain's tenosynovitis, brought on by banging too hard on spacebars.
zameerb 10 hours ago 1 reply      
Speaking as someone who has done quite a few of these carpal tunnel surgeries, the story does not fit it very well. The symptoms are worse at night and tremors are unusual. Common complaints are pain, a pins-and-needles sensation, and weakness.
ballpoint 4 hours ago 0 replies      
You obviously have a nose for good design.

Seriously though, that is awesome.

malkia 11 hours ago 0 replies      
Reminds me of this TED talk by Phil Hansen: Embrace the Shake - http://www.ted.com/talks/phil_hansen_embrace_the_shake?langu...
rebootthesystem 10 hours ago 2 replies      
I stopped using mice, knobs (IBM/Lenovo) and touchpads (built-in or external) probably twenty years ago. A few months into an intense design project I started to feel burning pain on my wrists. I was working 18 hour days, 7 days a week. Yes, if I was awake I was in front of the computer.

This was a hardware and software project and I was doing it all. This meant lots of precise motion at times. Running Solidworks or Altium Designer often meant very accurate tiny movement while pressing down on a button. Horrible stuff for your wrist.

I had been exposed to just how bad this could get. I was friends with several people who did visual effects for motion pictures. Same kind of work. They ran 3D workstations for a dozen or more hours per day, every day. One fellow had to have surgery on both wrists due to the damage he had caused. He was always in pain after that.

I decided I had to deal with the situation. I didn't want to end up like that.

The first decision was that mice and touch pads were horrible input devices. I tested everything and concluded that low-friction thumb-operated trackballs were the best.

Beyond that, the relative angle of the hand to the forearm seemed to have a HUGE effect on causing inflammation, pain and injury. The flatter and more relaxed, the better. In fact, the most relaxed position had my hands drooping over the keyboard and trackball with virtually no tension on the upper tendons. This meant my standard desk had to go.

What I needed was a desk with a cavity into which my hands would droop and meet the keyboard or trackball. My forearms had to be fully supported in order to take pressure off my shoulders and help my posture.

I welded together a few iterations of the idea and ended up with a desk that was just fantastic. I could work on this thing for 16 to 18 hours a day and have no wrist burn whatsoever. Of course, I also implemented regimented breaks and exercises, but the desk, as well as switching to a trackball, made the most difference.

I can't help but think this girl did herself huge damage by using the touch-pad for long hours. I particularly dislike touch-pads on laptops (of any make and model); they are in the wrong place and add tension to your tendons precisely where you don't want it.

As for Michelle, wow, what an amazing person she must be.

colordrops 11 hours ago 1 reply      
I know that this is off topic, but what a beautifully designed website!
enigami 10 hours ago 0 replies      
A need or problem encourages creative efforts to meet the need or solve the problem - Necessity is the mother of Invention. Kudos!!
djloche 12 hours ago 0 replies      
This is an excellent modern example of the tenacity of humanity.
Tombone5 5 hours ago 1 reply      
tl;dr. Young ambitious person gets first chance at a job she is trained for, feels pressure to work long hours without complaint. Because of too much work, she now can't use her arms half of the time; instead she draws with her nose and made a portfolio site with nose-drawing as its central focus.
m48 11 hours ago 1 reply      
I also have hand problems. I use speech recognition to type everything, including code, and a game controller to move around the mouse. The controller I'm currently using, a PS4 controller, has a trackpad in the middle, so I was able to mash that against my face and give her technique a shot.

It works better than you'd think. The PS4 trackpad isn't exactly brilliant, but I can move around the mouse and click on what I want to with some accuracy. Of course, the trackpad on the controller is very small and not very accurate, so it's not really practical for artwork or anything. But, with a better, larger trackpad, I can imagine this technique actually working. I might give it a shot at some point.

I am a bit worried about the inevitable neck and nose pain, though. I wish she had gone into a little more detail about how she avoids that. Maybe she just has a neck of steel?

For the curious, these are some other resources I've found about people working around RSI. Most of these are about using Dragon NaturallySpeaking to code by voice, since that's what I'm most interested in, but I think it's still interesting.

There really needs to be a list somewhere for open-source workarounds to disabilities. To the best of my knowledge, there really isn't one.

Natlink + Dragon NaturallySpeaking:

(NatLink, which lets you make custom speech commands for Dragon in Python, is currently being developed at http://qh.antenna.nl/unimacro/index.html, but that site's pretty incomprehensible. The original author's site is at http://www.synapseadaptive.com/joel/welcomeapage.htm. It's pretty out of date, but explains the fundamentals of the system better, I think.)

https://www.youtube.com/watch?v=8SkdfdXWYaI (don't bother looking around, the source code of this was never released)



Libraries for using Dragon NaturallySpeaking on Linux with VMs:



It seems a post about this kind of thing pops up about every other month or so. I'm thinking of showing off my system here when I polish it up a bit. It's not nearly as complicated as some of these other ones, but I'm beginning to get pretty close to normal typing speed coding by voice.

bhaumik 11 hours ago 0 replies      
This, my friends, is a good ole fashioned "hack."
brianbarker 6 hours ago 0 replies      
Integrate a tissue on the nose pad and you're ready for any season!
caser 6 hours ago 0 replies      
This is amazing.
IanDrake 13 hours ago 1 reply      
Neat, but something tells me this won't be catching on.
bbarn 12 hours ago 6 replies      
Only one mention of a doctor in the whole page, and he says to give it a rest and she'll be OK. Then she says she spends 50% of the time using her hands normally? No mention of further deterioration, or further attempts to figure out the problem? It seems like it's more important to her to say "look, I am clever" than to actually solve the problem here.
sukilot 9 hours ago 2 replies      
Most people tend to get RSI when they first become very active in computer work, and after a year or so they find a healthy posture, their hands strengthen, and it isn't an issue.

This article seems like a gimmick by someone who wants to be special instead of using perfectly effective solutions. Or it is a joke.

Taking Chrome DevTools outside of the browser
58 points by gidea  4 hours ago   5 comments top 5
Honzo 46 minutes ago 0 replies      
This is really neat.

FYI to run Chrome in remote debugging mode on a Mac

  > /Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --remote-debugging-port=9222
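Once Chrome is running that way, the DevTools remote debugging protocol serves a JSON index of debuggable pages at http://localhost:9222/json. A small sketch of pulling out each page's WebSocket debugger URL (the sample response below is invented, but the field names follow the protocol):

```python
import json

# Illustrative sample of what http://localhost:9222/json returns once Chrome
# runs with --remote-debugging-port=9222; the values here are invented.
sample = """[
  {"title": "Hacker News",
   "url": "https://news.ycombinator.com/",
   "webSocketDebuggerUrl": "ws://localhost:9222/devtools/page/ABC123"}
]"""

def debugger_targets(body):
    """Return (title, WebSocket debugger URL) pairs for each debuggable page."""
    return [(t["title"], t["webSocketDebuggerUrl"]) for t in json.loads(body)]

for title, ws_url in debugger_targets(sample):
    print(title, "->", ws_url)
```

Anything that can speak that WebSocket protocol, which is what this app does, can then drive the page.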

xasos 30 minutes ago 0 replies      
This looks super interesting, and I'm definitely going to try it out when I'm back at home.

GitHub repo if anyone is interested (With install instructions): https://github.com/auchenberg/chrome-devtools-app

chippy 40 minutes ago 0 replies      
How does this differ from the default Chrome DevTools remote debugging over the wire? Doesn't that also run its own HTTP server, in a way?


esamek 58 minutes ago 0 replies      
From the author's philosophical standpoint, I think this is indeed very interesting. I hate having to open up multiple instances of the dev tools since they are tied to the tab instance.
mijoharas 57 minutes ago 0 replies      
This seems like a really interesting idea, since I'm not by a computer right now, has anyone played around with it yet? How does it seem?
Rebrand Stage Fright to Overcome It
12 points by SwellJoe  2 hours ago   2 comments top 2
EGreg 15 minutes ago 0 replies      
This is definitely true. I say this as someone who has performed classical music in various venues like Carnegie Hall etc. (http://magarshak.com/piano) I started as a kid, when I didn't feel much stage fright at all, and after a break of a couple of years I remember the stage fright coming on. I was a pianist, so what if I made a mistake? Feeling like I needed to calm down and concentrate caused the jitters, whereas training to express yourself and just perform in the moment takes it away.

In general, accepting some kind of discomfort helps the body to accept it and deal with it. I discovered this a few years ago when walking through biting cold and deciding to EXPECT the sensation and accept it. My body kicked up the temperature production, and I felt a bit like that guy who can regulate his body temperature naked in the snow. Although of course less genetically adapted than he is :)

halfcat 1 hour ago 0 replies      
This sounds like the same concept from Kelly McGonigal's TED Talk, "How to Make Stress Your Friend" [1]. For those who don't want to sign up to read the Scientific American article, the video will give you the gist of the article.

[1] http://www.ted.com/talks/kelly_mcgonigal_how_to_make_stress_...

Microservices - Not a free lunch
106 points by olalonde  12 hours ago   57 comments top 11
Someone 3 hours ago 1 reply      
See also:

  - Subroutines - Not a free lunch
  - Libraries - Not a free lunch
  - Client-Server - Not a free lunch
  - OO - Not a free lunch
  - Multiprocessing - Not a free lunch
Many of these arguments apply to those, too. The main difference is one of scale: latency and call overhead are way larger than for the examples I gave.

Improved tooling can bring both down, but not by that much. That's why you will not see many micro services that are truly micro. You won't see (I hope) a 'printf' micro service, or even an 'ICU' micro service. A regex service might make sense if the 'string' it searches is large and implicit in the call (say, a service that searches Wikipedia), but by that point it starts to look like a database query. Is that still micro?
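The overhead point is easy to make concrete. A rough sketch (the `tax` function and the 21% rate are invented for illustration) comparing a direct in-process call with the same call plus the JSON marshalling a service boundary adds, before counting any network latency at all:

```python
import json
import timeit

def tax(amount):
    """The shared piece of code, as a plain in-process function."""
    return round(amount * 0.21, 2)

def tax_as_service(amount):
    # Simulate only the marshalling a microservice call would add: encode the
    # request and decode the response as JSON. Real calls add network latency,
    # framing, and error handling on top of this.
    request = json.dumps({"amount": amount})
    decoded = json.loads(request)["amount"]
    response = json.dumps({"tax": tax(decoded)})
    return json.loads(response)["tax"]

direct = timeit.timeit(lambda: tax(100.0), number=100_000)
marshalled = timeit.timeit(lambda: tax_as_service(100.0), number=100_000)
print(f"direct: {direct:.3f}s  with JSON marshalling: {marshalled:.3f}s")
```

On typical hardware the marshalled version is many times slower, and the gap widens enormously once a real network round trip is included, which is why a 'printf'-sized service makes no sense.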

gbog 4 hours ago 3 replies      
Listing the difficulties on microservices is useful, but let's not forget the difficulties encountered in monolithic big applications, which are the reason for splitting into smaller pieces.

In the big app: 1) a single syntax error breaks everything; 2) simply loading the whole app takes a huge amount of RAM, and tests are very slow; 3) there is a gigantic dependency tree, since when you depend on a module you also depend on every one of that module's dependencies; 4) almost no one knows everything about the app; 5) it is impossible to split the company into separate services without fights about shared code and architecture decisions.

olalonde 6 hours ago 13 replies      
On a related note, is anyone aware of any tool(s) to declaratively describe an application architecture and automate its deployment on something like AWS or in a single virtual machine + docker? For example "I want 10 replicated microservice1 processes and 5 replicated microservice2 processes in front of a nginx load balancer. Also, throw in a Redis and a PostgreSQL database." This would be immensely useful for (one man show) developers who don't necessarily have the time or experience to "properly" setup and fine tune infrastructure. I think what I'm describing is known as "Infrastructure as code" in DevOps parlance.

Edit: Since this question is getting many replies, I'll be a bit more specific in what I am looking for. Is there a tool that would let me describe the infrastructure and deploy it on a given cloud provider but also have the ability to deploy the same infrastructure on my local machine (using VMs/docker) for development purposes.
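The idea being asked for can be sketched as data plus a generator: the topology is declared once, and deployment commands are rendered from it for whatever target you like. This toy illustration is not any particular tool's format (service names, image names, and replica counts are all invented), though tools in this space such as Fig/docker-compose and CloudFormation are built on the same principle:

```python
# Declarative topology: what to run and how many copies, as plain data.
topology = {
    "microservice1": {"image": "acme/microservice1", "replicas": 10},
    "microservice2": {"image": "acme/microservice2", "replicas": 5},
    "redis":         {"image": "redis", "replicas": 1},
    "postgres":      {"image": "postgres", "replicas": 1},
}

def docker_commands(topology):
    """Render the declared topology into `docker run` invocations."""
    cmds = []
    for name, spec in sorted(topology.items()):
        for i in range(spec["replicas"]):
            cmds.append(f"docker run -d --name {name}-{i} {spec['image']}")
    return cmds

for cmd in docker_commands(topology):
    print(cmd)
```

The same data could just as easily be rendered into AWS API calls for production and local docker commands for development, which is the portability the comment is asking about.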

polskibus 5 hours ago 0 replies      
In my opinion the article does not put enough weight on lack of transactions in a microservices architecture. Of course you can add distributed transactions, but doing so makes the architecture much more intertwined and much less error-proof.
aidos 5 hours ago 1 reply      
I enjoyed the article. As ever, there are tradeoffs and your mileage will vary with different approaches.

This doesn't seem like an entirely fair comparison:

"It seems to me that all three of these options are sub-optimal as opposed to writing the piece of code once and making it available throughout the monolithic application."

There's a different scenario that could have played out with how to share a library between different services. You could have written the bulk of your application in the same language, like a monolithic application but split into several services. In that case you could create a library for your tax calculations and use it freely within your services.

For me, I split my application into a small number of services and, as much as possible, split things out into libraries to make reuse simple (more libraries, thinner applications).

Sometimes I use different languages, but when I do, I consider very carefully whether the (rather large) tradeoff that presents will be worth it in the long run for what I'm getting in the short term.

A question: when people talk about microservices, how small are they talking?

ExpiredLink 4 hours ago 0 replies      
> Implicit Interfaces

This is a crucial point. 'We use microservices' does not say much unless you can describe how you design consistent and granular service interfaces. Otherwise you most probably just produce microservice spaghetti.

mavdi 2 hours ago 1 reply      
I've pretty much faced all of the problems mentioned in this article. An application is a set of services with a certain complexity that work together. In some cases this complexity just gets transferred from the service level to the orchestration level.

Don't get me wrong, I love microservices and will try to use them in all my future work, but I think where people often go wrong is that they over-commit without realising the downsides. I often see people casually using AMQP queues for pretty much everything; only when they need the worker service to talk back to the originating service do they realise they've made a wrong architectural decision.

blablabla123 1 hour ago 1 reply      
Why is highavailabilty.com slower than most other websites? :>
cs02rm0 7 hours ago 4 replies      
I was prepared to be convinced - what in our field brings nothing but benefits? I didn't see much to worry about here though:

* who cares if you're deploying 20 services or 1 when you run your deploy script?

* who doesn't have an operationally focused dev team?

* who isn't already dealing with interfaces even if not using micro services?

* since when do micro services have a monopoly on distributed architectures or asynchronicity?

I can't agree testing is harder either. Too much fluff in this article to read much value into it.

programminggeek 1 hour ago 0 replies      
Fundamentally, there isn't a huge difference between microservices and monolithic apps if both systems do the same thing. However, a lot of the issues of general complexity are more apparent in one approach or the other.

For example, breaking changes are breaking changes in either system. It's not an issue of architecture style, it's a matter of business needs changing, and thus protocols change. A change in protocol breaks the existing protocol.

We understand this intuitively when you are talking about a de-facto protocol like HTTP, but we seem to think our own programs are somehow different. They aren't.

Architecture is about taking the essential complexity of a problem and creating components and protocols to solve a problem in a way that makes the most sense for the team trying to solve it. Monolithic apps or microservices then should be a question more of what your team is going to execute well as much as it is a question of which structure more elegantly solves the problem at hand.

DanielBMarkham 5 hours ago 1 reply      
I'm really happy to see DevOps, CD, and Microservices take off, although, as usual, I'm a little perplexed as to why we need new slang. But hey, if it works to get the message out, I'm all for it.

These things are not new, but like so many other ideas, they're just old ideas re-appearing in a context that had forgotten about them.

Traditionally, many of the potential problems the author relates have been solved by architectural conceits. For instance, standardize on a programming language and datastore, then share all persistence-related files. (I'd strongly suggest a FP language, preferably used in a pure manner). Then you've decreased the "plumbing" issues by a couple orders of magnitude, lowered the skill bar for bringing in new programmers, and you can start talking about using some common testing paradigms to work on the other issues.

I'm a huge fan of microservices, but it's good to talk about the bad parts too, lest the hype overrun the reality.

The future of jobs: The onrushing wave
95 points by bootload  17 hours ago   54 comments top 15
josephg 5 hours ago 2 replies      
There's a great 15-minute video by CGP Grey introducing the same set of concepts, that technology will contribute to unemployment: https://www.youtube.com/watch?v=7Pq-S557XQU

It's a bit pop-sciency, but it's very approachable and well explained.

grandalf 2 hours ago 1 reply      
If unemployment ever becomes a problem we can simply outlaw the wheat combine, which will open up hundreds of thousands of jobs harvesting grain manually.
SideburnsOfDoom 6 hours ago 4 replies      
They estimate in their chart that Athletic trainers have a 0.007 chance of being automated, the third lowest chance after therapists and dentists.

It's hard to know exactly what they class as "Athletic trainers". For Olympic athletes, there will always be staff. For the rest of us, what would that "unlikely" automation of the coach look like? Perhaps it would start with a movement sensor on the wrist? A smartphone app to track and recommend exercise?

That's not looking unlikely to me, it's looking like it's already here in plain sight:




michaelochurch 32 minutes ago 0 replies      
I don't blame technology for the stagnation of the American middle class. I blame poor leadership, but I think there's something else going on that isn't getting a lot of press: latency skew.

Instead of waiting two to three days for a piece of postal mail, we're annoyed if that email takes two minutes. I'm not going to moralize about "instant gratification" as if it were wrong, because it's mostly not conscious and it's not a moral issue; we're just being neurologically retrained to resist delays. From a website, 10 seconds means "never": it's down, or in an unusable state. We're also (some of us, at least) at a ridiculous level of comfort; we have people who program their garages to heat up 30 minutes before they leave for work, because they can't stand the thought of 45 seconds' exposure to winter cold.

What's not becoming instant is human learning. If something can be learned quickly and will become rote, we can now program a computer to do it. So the things that we need humans for tend to be those that require subtlety or experience. That hasn't gotten faster. It still takes 6+ months before someone is good at his job. That's not a new problem. It's just less tolerated because people are more primed to expect instant results. So we're seeing an aversion to training people up into the better, more complex jobs that technology creates.

desdiv 3 hours ago 0 replies      
The chart claims that economists have a 43% chance of being replaced by automation.

If they're wrong, then they'll be replaced by algorithms that are better at predicting economic trends than they are.

If they're right, then...

danmaz74 5 hours ago 3 replies      
Very interesting analysis. But they don't consider at all one very relevant change that happened during the last 30 years in most wealthy economies, especially the USA: That the taxation of the rich and the highest-earners in general was massively lowered. This contributed a lot to the concentration of wealth and capital in fewer hands.
jfoster 4 hours ago 3 replies      
The interesting thing is that a wave of redundancy due to automation or unprecedented efficiency contains problems that those made redundant might be able to work on.

Eg. Joe can't afford X due to lack of employment, and Bob can't afford Y for the same reason. Assuming Bob can supply X and Joe can supply Y (or some more complicated network of bartering), it seems like it should be possible for an economy to arise.

moonchrome 3 hours ago 0 replies      
> eventually full AIs

At that point there will be no corporations or anything else related to our society, because it will break the premise society is built on: that we are better off working with each other. Someone in control of such an AI doesn't need anyone else. It's impossible to guess what they would do with it, but a logical choice would be to eliminate potential threats (i.e. anyone who can develop similar technology or compromise it).

Once someone kicks off a "seed AI" that can develop/replicate fast enough, it's game over for the rest: they win. And note that by AI I'm not talking about Terminator-style self-aware machines - I'm talking about a problem-solving device capable of performing given tasks.

tokenadult 1 hour ago 0 replies      
I see the dateline of this interesting submission (which looked familiar) is "Jan 18th 2014 | From the print edition." There seems to be a whole spate of year-old submissions to Hacker News this weekend. Are we trying to do a year in review, to see which predictions of 2014 turned out in reality?

As for the substance of this interesting submitted article, the historical facts are reviewed in a key paragraph before the article goes off into speculation about the future: "For much of the 20th century, those arguing that technology brought ever more jobs and prosperity looked to have the better of the debate. Real incomes in Britain scarcely doubled between the beginning of the common era and 1570. They then tripled from 1570 to 1875. And they more than tripled from 1875 to 1975. Industrialisation did not end up eliminating the need for human workers. On the contrary, it created employment opportunities sufficient to soak up the 20th century's exploding population. Keynes's vision of everyone in the 2030s being a lot richer is largely achieved. His belief they would work just 15 hours or so a week has not come to pass." The nub of the article's argument is that new forms of technological change might not leave us with any new forms of gainful employment.

After its interesting text discussion and chart predicting what kinds of employment are least likely to be automated out of existence, the article points out one difference between the world of the past and the world of today: "Another way in which previous adaptation is not necessarily a good guide to future employment is the existence of welfare. The alternative to joining the 19th-century industrial proletariat was malnourished deprivation. Today, because of measures introduced in response to, and to some extent on the proceeds of, industrialisation, people in the developed world are provided with unemployment benefits, disability allowances and other forms of welfare. They are also much more likely than a bygone peasant to have savings. This means that the 'reservation wage' - the wage below which a worker will not accept a job - is now high in historical terms. If governments refuse to allow jobless workers to fall too far below the average standard of living, then this reservation wage will rise steadily, and ever more workers may find work unattractive. And the higher it rises, the greater the incentive to invest in capital that replaces labour." Indeed, it may be that the funding of governmental benefits will become secure enough through rising productivity that many current workers will have children who do not need a job at all.

doctorstupid 3 hours ago 2 replies      
Of course, the other side of the coin says that there are too many people.
j1o1h1n 4 hours ago 0 replies      
Decent sci-fi reading list in the sub-headings.
SixSigma 2 hours ago 0 replies      
Free trade destroys wages; of this there is no secret.

De-regulation subverts democracy.

Time to revisit the relationship between capital and well-being. Ricardian theories of comparative advantage drive wealth into the hands of those who control capital, not into the calloused hands of the poor suckers who sweat.

jokoon 1 hour ago 0 replies      
The domain of computer science that seems to have a bright future is machine learning.
aaronhoffman 2 hours ago 0 replies      
I'd recommend reading Economics in One Lesson http://mises.org/library/economics-one-lesson
DanielBMarkham 2 hours ago 0 replies      
Science is by definition the creation of causal chains. If I do X in this system, Y will result. Hence the importance of falsifiability and reproducibility.

This is yet another in a series of economic articles that sound much more like a typical op-ed column than an observation, hypothesis, or proof.

Adblock Plus is probably the reason Firefox and Chrome are such memory hogs
696 points by lelf  1 day ago   371 comments top 85
SwellJoe 19 hours ago 6 replies      
I use AdBlock Plus not because I don't want to see ads...I actually don't mind ads. I use it because I don't want auto-starting audio, ever. I have 30-60 tabs up at any given time. When an ad starts playing audio, it disrupts my entire workflow, possibly disturbs my partner sleeping next to me, etc.

In short, auto-start audio in ads is quite simply so far outside of what I consider acceptable behavior, that I'm willing to burn the whole goddamned business model to the ground to stop it. I disable AdBlock Plus for sites that I know to behave responsibly with regard to their ads (reddit, probably a couple of others). If there were a list of Advertising Good Citizens who never use auto-starting audio ads (such a list would probably need to demand a few other things, like a good privacy policy, no popups/popunders, etc., but audio is the single reason I installed AdBlock Plus), that could be dropped into AdBlock Plus, I would happily use it. I don't mind ads, but the second somebody disrupts my work, my conversation on Skype, my partner's sleep, my music listening, etc. is the second I grow to hate the site and the advertiser.

MicroBerto 1 day ago 7 replies      
People aren't using Adblock to conserve resources. They're using it to block ads and most are willing to take a performance hit to do so.
neals 1 day ago 12 replies      
I used to not do Adblock, because you know, ads make the internet go round and all that.

But with the "download here"-button ads, there is no way to know what button to press anymore. Now installing Adblock is a requirement.

elorant 1 day ago 2 replies      
I'd rather pay a few dozen bucks for more RAM than run the risk of being infected by malware served from compromised ad servers. Not to mention the significant decrease in aesthetics/usability when pages include a dozen different ad areas.
anon4 1 day ago 4 replies      
I see nobody has posted it, so here is a very lightweight alternative - it redirects all known ad-hosting sites to


ben0x539 19 hours ago 1 reply      
Sounds like if Mozilla wants to improve Firefox memory usage, they should work on blocking ads natively so people don't need Adblock Plus.


TeMPOraL 23 hours ago 5 replies      
> In Nethercote's testing, he found that TechCrunch used around 194MB of RAM without ABP enabled

194MB for a single webpage that should mainly be text communicating a message. Does anyone besides me find this crazy?

alopecoid 1 day ago 2 replies      
A bit off topic, but could someone explain to me why it's so difficult for a browser (or extension) to effectively block 100% of all pop-ups/pop-unders? I realize that these account for only a fraction of ads, but they are really annoying and it seems that these should be the easiest to detect; doesn't this essentially boil down to a few specific API calls? For the few cases that a pop-up/pop-under is legitimate (really, are there any?), I'd be fine whitelisting these on a case-by-case basis.
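For what it's worth, the "few specific API calls" intuition is roughly right: in a content script you can wrap window.open so that calls made outside a recent user gesture are swallowed. A rough sketch in plain JavaScript; every name here (makePopupGuard, GESTURE_WINDOW_MS, the one-second window) is invented for illustration, and real pop-up blockers handle far more edge cases:

```javascript
// Gate window.open behind a recent user gesture. Calls are only forwarded
// to the real open() if a click or keypress happened within the last second.
function makePopupGuard(realOpen, now = () => Date.now()) {
  let lastGesture = -Infinity;    // no gesture seen yet
  const GESTURE_WINDOW_MS = 1000; // how long a gesture "authorizes" a pop-up

  return {
    recordGesture() { lastGesture = now(); },
    open(...args) {
      if (now() - lastGesture <= GESTURE_WINDOW_MS) {
        return realOpen(...args); // user-initiated: allow
      }
      return null;                // unsolicited pop-up/pop-under: block
    },
  };
}

// In a browser it would be wired up roughly like this:
// const guard = makePopupGuard(window.open.bind(window));
// ['click', 'keydown'].forEach(ev =>
//   document.addEventListener(ev, () => guard.recordGesture(), true));
// window.open = (...a) => guard.open(...a);
```

Pop-unders that ride on a legitimate click are the hard part, which is why real blockers also look at who is calling and where the window is going.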
brazzledazzle 15 hours ago 2 replies      
There are a couple of inaccurate things in that article that stood out to me. It should be pretty easy to measure the CPU hit when running ABP; it looks like the author is running Windows, so he should be able to use perfmon. He also mentions that "As with all proxy servers, though, the one caveat is that it doesn't work with HTTPS connections", which is very much wrong.
shmerl 11 hours ago 0 replies      
Didn't the ABP developers plan to address this? How is that effort going?

UPDATE: Here is the bug to track: https://bugzilla.mozilla.org/show_bug.cgi?id=988266

currysausage 1 day ago 2 replies      
My current favorite solution for blocking the more annoying ads and increasing security while preserving a low-latency browsing experience: Click-to-play for Flash Player (and all the other plug-ins).

There are lots of gray boxes now all over the web, but I prefer them over resource-hungry attention-grabbing Flash ads.

avinassh 6 hours ago 0 replies      
I use my Raspberry Pi as an ad-blocking proxy and I don't have to worry about performance on my machine at all! Some routers have this option built in, and you can also block the IPs of common ad publishers.

[0] - https://learn.adafruit.com/raspberry-pi-as-an-ad-blocking-ac...

dendory 1 day ago 2 replies      
What kind of fucked up site has 10+ iframes per page, let alone hundreds? And the example site he uses takes 530megs to load even without addons. I'm sensing the issue is somewhere other than Adblock Plus.
jervisfm 5 hours ago 0 replies      
I realized that Adblock Plus was a performance hog a long time ago, and it is good to see that information being shared widely. The issue especially impacts those who have many, many tabs open all at once.

Interestingly, there is a uBlock alternative which I had not heard about [1]. If the efficiency claims are true, it would definitely be worthwhile to switch over to that instead.

[1] https://github.com/gorhill/uBlock/wiki/%C2%B5Block-vs.-ABP:-...

pmontra 7 hours ago 0 replies      
It's unfortunate, and maybe it can be engineered in a better way, but I'm still happy to trade CPU and RAM for uncluttered web pages. Whenever I watch somebody browse without AdBlock, I wonder how they ever got used to making sense of that mixed mess of ads and content.

Furthermore, if sites are paid per click, my pre-AdBlock contribution to all the ad-sustained web sites was probably way less than $1. They are losing little by my use of AdBlock. One reason I didn't click ads was precisely that they are so intrusive and ugly. Thanks to AdBlock I'm spared that, and they don't lose bandwidth and CPU serving ads that won't be clicked. Win-win.

sinemetu11 1 day ago 1 reply      
I use a modified hosts file instead, which is much better. See http://someonewhocares.org/hosts/ and remember: the web is pull, not push.
thibauts 4 hours ago 0 replies      
I made a host filtering proxy with node a while ago to solve exactly this problem. https://github.com/thibauts/node-host-filtering-proxy

Edge cases are not completely ironed out, but if people want to help perfect it I'll very happily accept PRs!

ddorian43 1 day ago 1 reply      
Isn't using 200MB for a webpage (TechCrunch) actually bad in the first place?

Do all designers work on their iExpensiveMachine and not care about other people's lower-end machines/phones/tablets/whatever?

jacquesm 1 day ago 2 replies      
He's definitely on to something here, FF with a hundred or so tabs open:

With ABP:

   12961 XXXXXX    20   0 2615960 1.483g  58692 S  45.2  9.5   2:13.00 firefox 

   13098 XXXXXX     20   0 2157212 1.109g  57476 R 142.6  7.1   1:26.25 firefox                              
Both after a complete stop and start of the browser, restoring every tab by activating it.

ghantila 1 day ago 3 replies      
I uninstalled it right after hearing from Mozilla [1] about the effects of Adblock Plus on Firefox. Since then, I've been blocking things manually.

First I was using this [2] hosts file by Dan Pollock. But now I've switched to this Neocities-hosted site [3]. I don't know who manages it, but the entries are uniquely sorted from various sources, including entries from Dan Pollock's hosts file.

Apart from this, I use Privacy Badger [4], Self-Destructing Cookies [5], HTTPS-Everywhere [6] and Disconnect [7].

[1] https://blog.mozilla.org/nnethercote/2014/05/14/adblock-plus...

[2] http://someonewhocares.org/hosts/

[3] https://hosts.neocities.org/

[4] https://www.eff.org/privacybadger

[5] https://addons.mozilla.org/en-US/firefox/addon/self-destruct...

[6] https://www.eff.org/https-everywhere

[7] https://disconnect.me/

arthurk 1 day ago 0 replies      
I've been using http://someonewhocares.org/hosts/ for almost a year now and it works very well. No need for adblock.
legulere 1 day ago 1 reply      
I don't even use an adblocker. I simply use Ghostery to block trackers, and as a nice side-effect almost no ads show up anymore.
Animats 21 hours ago 1 reply      
Maybe I should write one.

I wrote AdLimiter (adlimiter.com), which eliminates certain ads from Google search result pages. It's really a demo for our site rating service, SiteTruth, but it has the internal machinery to find and remove ads. The basic idea is that it finds links to sites in the DOM, and works outward in the DOM to find the ad boundary. Then, if analysis of the link indicates the ad should be deleted, the offending section is deleted from the DOM. It takes one linear pass through the DOM to find ads. If the page changes, another pass is made five seconds after the changes quiet down. This approach is reasonably general and requires little maintenance.
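The "work outward to find the ad boundary" step can be sketched in a few lines. This is my illustrative reconstruction, not AdLimiter's actual code: nodes are plain objects with parent/children pointers so the logic reads without a browser, and isAdOnly stands in for whatever link analysis decides that a container holds nothing but ad content.

```javascript
// Starting from an off-site link, climb the tree while the enclosing
// container still holds only ad-related content; the last such ancestor
// is the "ad boundary" that can safely be deleted.
function findAdBoundary(linkNode, isAdOnly) {
  let node = linkNode;
  while (node.parent && isAdOnly(node.parent)) {
    node = node.parent;
  }
  return node;
}

function removeAd(linkNode, isAdOnly) {
  const boundary = findAdBoundary(linkNode, isAdOnly);
  if (boundary.parent) {
    boundary.parent.children =
      boundary.parent.children.filter(c => c !== boundary);
  }
}
```

With real DOM elements, `parent` becomes `parentElement` and the filter becomes `boundary.remove()`.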

I once looked at AdBlock's code. Internally, AdBlock makes heavy use of regular expressions and does a lot of searching. It seems to be doing more work per page than should be necessary.

In some ways, Ghostery is more useful than AdBlock. It blocks most trackers, which reduces network I/O. Some ads disappear once their tracker is disabled. (CBS TV shows play without commercials if Ghostery is running.)

I've been toying with the idea of an ad blocker that uses simple machine learning. All content coming from off-site links gets a light grey overlay. If you click on the grey overlay, it disappears so you can view the content, and the off-site link is rated as less spammy. If you ignore the overlay, the off-site link is viewed as more spammy. The grey overlays gradually get more opaque over areas where you never remove them, and when they go fully opaque, the covered content is deleted completely.
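The learning rule could be as simple as a per-origin score that doubles as the overlay's opacity. A toy formulation of the idea (mine, not Animats'; the step size and the 0.5 starting point are arbitrary):

```javascript
// Each off-site origin keeps a spamminess score in [0, 1], used directly
// as overlay opacity. Clicking through the overlay lowers it; ignoring it
// raises it; at 1.0 the covered content would be deleted outright.
function makeSpamModel(step = 0.1) {
  const scores = new Map();                        // origin -> score
  const get = origin => scores.get(origin) ?? 0.5; // new origins start light grey

  return {
    opacity: get,
    clickedThrough(origin) { scores.set(origin, Math.max(0, get(origin) - step)); },
    ignored(origin)        { scores.set(origin, Math.min(1, get(origin) + step)); },
    shouldDelete(origin)   { return get(origin) >= 1; },
  };
}
```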

efbbbf 1 day ago 0 replies      
panzi 13 hours ago 0 replies      
> click-to-play add-ons like Flashblock

You don't need an add-on for that. Firefox and Chrome both have this as an optional built-in feature. I think it should be on by default (which it sadly isn't).

zo1 1 day ago 2 replies      
From the article:

> "The main problem, though, is the process by which ABP actually blocks ads. Basically, ABP inserts a massive CSS stylesheet occupying around 4MB of RAM into every single webpage that you visit, stripping out the ads."

Isn't the actual problem then the way that Adblock+ removes the ads? Why not simply provide an API for a plugin to easily strip content from a site? If there is one already, then ABP should switch to it to reduce memory usage. If not, well, then we should look at the actual culprits of this problem, which are the browsers.
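One way such an API (or ABP itself) could cut the per-page cost: inject only the element-hiding rules relevant to the current host instead of the whole list. A hedged sketch; the "host##selector" rule format is invented here, loosely modeled on common element-hiding syntax, and a production blocker would index rules rather than scan them per page:

```javascript
// Build a minimal stylesheet for one host from a big rule list.
// "##selector" rules are generic; "a.com,b.net##selector" rules apply
// only on those domains (and their subdomains).
function rulesForHost(allRules, host) {
  const selectors = [];
  for (const rule of allRules) {
    const sep = rule.indexOf('##');
    if (sep === -1) continue;           // not an element-hiding rule
    const domains = rule.slice(0, sep);
    const selector = rule.slice(sep + 2);
    const applies =
      domains === '' ||                 // generic: applies everywhere
      domains.split(',').some(d => host === d || host.endsWith('.' + d));
    if (applies) selectors.push(selector);
  }
  return selectors
    .map(s => `${s} { display: none !important; }`)
    .join('\n');
}
```

A content script would then inject only this much CSS via a single style element, instead of megabytes of rules on every page.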

jim_greco 1 day ago 1 reply      
I wish the internet was usable without it.
wyclif 16 hours ago 0 replies      
Here's what I do for Firefox and AdBlock Plus:

Under "Filter Preferences", uncheck everything marked EasyList. Under "Custom Filters" add EasyList without element hiding rules and make sure it's checked. Restart. This speeds things up considerably.

On Chromium I don't use AdBlock Plus, it's too much of a memory hog. Instead, I use the much leaner and faster µBlock for dynamic filtering:


userbinator 1 day ago 0 replies      
The Proxomitron (local filtering proxy) + HOSTS file + Javascript whitelisting is all I need.

Turning off JS instantly removes a ton of annoyances, effectively removes a huge attack area for browser exploits (I've accidentally infected others by sending them links to sites that had no effect on me...), and while there are certainly sites that require it (often not even of the "application" type, but just to do something that could've been done without), the majority of the ones I come across in e.g. searching for info don't need it. If a site I find when searching refuses to show anything, then I'll just go back and continue with the next search result or use Google's text-only cache which often does have the content I'm looking for in a more readable format. The whitelist is reserved for sites that are both highly trusted and absolutely necessary to enable JS on.

htilonom 20 hours ago 0 replies      
Why does the topic say Firefox when µBlock isn't even for Firefox? Chrome is the problem, not Firefox.

All browsers became huge memory hogs lately, but I'd rather spend 100MB+ of RAM just on Adblock than be blinded by ads (of which a lot are malicious).

therzathegza 21 hours ago 0 replies      
It increases the SNR of the internet so much that it's actually worth it. I'd rather spend more on memory and have a less annoying internet :).

Consider the technical solution alternatives, and getting people to use them. Now look at how easy it is to install ABP.

getdavidhiggins 15 hours ago 0 replies      

http://someonewhocares.org/hosts/
HN thread: https://news.ycombinator.com/item?id=6002544

Make sure localhost / resolves to something though. A blank `index.html` in the root is lightweight enough. For the perfnerds, you can serve it using lighttpd.
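For anyone trying this, a minimal version of the setup looks something like the following (the ad domains are made-up placeholders, not a curated list):

```
# /etc/hosts entries pointing known ad hosts at this machine       ads.example-network.com       tracker.example-cdn.net       banners.example-ads.org
```

With a tiny local web server (lighttpd, as suggested) answering on port 80 with a blank index.html, pages render empty space instead of connection errors where the ads used to load.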

Yizahi 21 hours ago 2 replies      
Behold the power of the Time Machine: text straight from the '90s-'00s. Everyone was like "Oh, this new Windoze uses 500 megs when idle! Then 1000 megs! Then 2000 megs! etc. etc.". Come on people, we buy our hardware precisely for this - to be used.

Now this ET site with 28 spyware modules from random advertisers is telling me: "Don't use your CPU, it will make it slower; also don't use too much RAM. Instead make your brain slower and use more brain memory for useless stuff. Because, you know, think about the children, and because we are entitled to our business model by birthright."

frik 23 hours ago 2 replies      
It's a chicken-and-egg problem.

1) Website owners use ads to earn money (server and employee costs, revenue).

2) Several ad networks create ads that influence and annoy users and track their browsing habits, e.g. blinking, buggy video ads with sound that sometimes even crash the flash plugin.

3) Visitors install Adblock+ because they dislike intrusive ads that don't adhere to the browser privacy settings. Some websites are unusable without an ad-blocker. Many websites even crash the mobile browser.

4) Companies buy ads cheaper, as ads are in reality clicked by clicking-bots and are worth less than years ago.

5) Website owners place more ads because they earn less for the same amount of served ads as last year, and because more and more visitors block all ads altogether.

6) Even more visitors install adblockers and turn them on for all websites.

7) Website owners introduce a paywall so that part of their content, or everything, is behind closed doors for paying customers only.

8) The visitors switch to alternative websites that offer comparable content for free, monetized by ads or maintained as a hobby.

9) and so on...

Paywalls are not the solution. Another website will appear that offers comparable content for free. Example: The Microsoft Network (MSN 1), which was meant to replace the WWW in 1995 and shipped with Win95. Bill Gates had laid out his vision of the information highway, micropayments and the Microsoft Network in his famous book "The Road Ahead"; half a year later, as MSN failed and the WWW succeeded, he released a second edition that removed all those references and acknowledged the WWW. Pictures of MSN 1: http://winsupersite.com/windows-live/msn-inside-story

Everyone has to change a bit, to improve the current status:

* The ad-networks that still code flash-based ads should move to HTML5.

* The companies that buy ad-space from such ad-networks should care more about the ad-quality.

* The website owners should care more about which ad-network they select.

* The visitors should use adblock plugins less.

* The WHATWG/W3C/browser developers should improve some situations with HTML 5.x so that there is not a single reason left for flash-based ads.

girishso 1 day ago 1 reply      
I have switched to HTTP Switchboard, works pretty well. https://chrome.google.com/webstore/detail/http-switchboard/m...
gcb0 9 hours ago 0 replies      
if you want to get rid of intrusive ads and tracking, you should use the hosts file from someonewhocares.org,

because ad blockers won't work on mobiles, and on devices owned more by Google/Apple than by yourself.

if you want to block malicious scripts and plugins, you should use NoScript.

both will actually improve memory use and performance, not to mention getting closer to solving the problem in the first place

ChrisGranger 20 hours ago 0 replies      
My computer is a Windows Vista-era relic with 2 GB of RAM and I rarely have any trouble browsing unless I've got loads of tabs open (which would undoubtedly gobble all of my memory regardless).

I'm using RequestPolicyContinued. Using a whitelist for third-party requests seems to do a lot of the heavy lifting so that my Adblock Plus filters are limited to a handful of specific things I don't want to see.

0dmethz 1 day ago 3 replies      
Use AdBlock. Not AdBlock Plus.

Remember: ABP = Shit. AB = Good.

matkam 23 hours ago 0 replies      
sumitviii 18 hours ago 0 replies      

Now that's real irony.

mavsman 17 hours ago 0 replies      
Totally worth any memory issues it causes. I jumped on Safari the other day without it and it was awful.
_greim_ 21 hours ago 0 replies      
I don't mind ads most of the time. I have a script that rips out iframes, flash and other junk, and does a few other "cleansing" operations. It works fairly well and it's satisfying to push the "zap" button when a certain page is getting particularly annoying, and see all those rectangles become suddenly blank.
gondo 1 day ago 1 reply      
What about forking Chromium and integrating adblock directly into the browser? Plus removing all the Google nonsense (like disabled cross-domain security for some Google domains).
joeyyang 19 hours ago 0 replies      
I was recently having a ton of memory issues with Chrome without Adblock installed -- turns out a lot of Chrome extensions are memory hogs.

A quick `ps aux` will let you see what extensions are the worst offenders; clearing out a bunch of unused extensions helped restore my computer back to normal.

jordanpg 10 hours ago 0 replies      
Has anyone seen analysis that covers Adblock in addition to Adblock Plus? Do the same findings still obtain?
Walkman 1 day ago 1 reply      
There was a very good discussion [1] about this 8 months ago when I realized the same [2].

I suggest trying HTTP Switchboard instead (see the previous thread).

[1]: https://news.ycombinator.com/item?id=7765758

[2]: https://news.ycombinator.com/item?id=7765904

snird 1 day ago 2 replies      
I'm using Ghostery as a privacy and ad-blocking extension. Does anyone have data on the impact Ghostery might have on the browser memory usage?
thegeomaster 1 day ago 0 replies      
I can completely agree with this. I have a shitty machine with 2GB RAM, and more often than not it freezes almost completely when I compile big(gish) projects because it hits swap. I keep my Firefox open all the time (who doesn't?), and since I disabled Adblock Plus this hasn't been an issue anymore; Firefox's resident set size has plummeted from 1.2GB to 500MB for me.
mmgutz 1 day ago 6 replies      
Noscript is not an option anymore. Too many sites rely on javascript. I'll take the memory hit every time to use adblock.
pmelendez 1 day ago 0 replies      
I do not use an ad blocker, and Firefox and Chrome already devour my RAM. I can't imagine how much worse it would be with an ad blocker.
linuxhansl 16 hours ago 0 replies      
I just maintain my own small custom list; all automatic lists are disabled. Ads are OK unless they are animated, IMHO. Flash is totally out, of course. In fact, I think I am doing these advertisers a favor, since I keep a mental list of who not to buy from; that list includes folks who have animated and flash ads (and those who advertise on Fox News, but that is a different story).

So I usually just block ads that are really obnoxious. Every few months I review the list and remove any entry that was filtered less than 10 times or so.

That has two advantages. First I get to control what is filtered, and second it prevents that list from growing so much.

101914 1 day ago 0 replies      
I use DNS to block ads when control over the HOSTS file is not available to the user.

Works across all applications, e.g. browsers, apps, and requires no installation of third party software e.g. extensions.

I'm curious: do these ad blocking "solutions" operate as businesses? Do ad servers pay to be removed from blacklists? Pardon my ignorance.

sly010 1 day ago 0 replies      
Isn't there a way for adblock+ to insert just the right set of css rules based on the page it is inserted to? It's not like every site displays ads from every possible ad network. I don't know the internals of AB+ so correct me if this is impossible for some reason.
mattmurdog 14 hours ago 0 replies      
Why would this even matter with the type of machines we have today? I'm sure your 8gigs can handle a little Ad Block.
st3fan 1 day ago 0 replies      
You can remove the 'probably'. It is pretty well known that Adblock is a major memory hog.
psykovsky 1 day ago 0 replies      
I prefer the hosts file way of redirecting ad/tracking domains to localhost. There are some nice hosts lists already compiled, which you can download to make your own. The resources problem simply went away when I started doing it this way.
RRRA 1 day ago 0 replies      
Some ad blocking should definitely be built in for performance reasons, then...
cm3 22 hours ago 0 replies      
For blocking hosts you can also use https://opensource.conformal.com/wiki/Adsuck
agumonkey 1 day ago 0 replies      
I often send URL to printfriendly, most of the time it's light, readable and ads-free. Works with Firefox keyword tags (added manually) and with Chrom* urlbar (added automatically after a few visits).
philip1209 22 hours ago 1 reply      
If we can curate a list of ad domains, then adding them to the hosts file and null-routing the DNS would both remove ads and be super quick.
atoponce 1 day ago 0 replies      
It's nice, but it seems to be a bit overly aggressive. As an example, it's removing the "Share to Twitter" button on my TT-RSS install.
paulhauggis 23 hours ago 1 reply      
So, with all the people here using Adblock: should I feel bad when startups go bust because of the lack of advertising dollars?
mbrock 1 day ago 0 replies      
I once did some profiling on a heavy single page app and found that the overhead of AdBlock+'s extreme amount of CSS selectors was a major culprit.
dreadfulgoat 17 hours ago 0 replies      
I just realized that getting something like AdBlock as a transparent default in major browsers is something that could actually kill Google as we know it.

A lot hinges on modern auto-update pipelines.

XorNot 17 hours ago 0 replies      
This doesn't explain how Chrome, on a PC which is just serving a simple billboard page with some dynamic jQuery to update it, ends up using 4GB of RAM after a few hours despite no JavaScript memory leaks.
foxhill 1 day ago 0 replies      
being a.. "tech" person, i always get grief from people when i tell them i don't use adblock. "you should know better" or words to that effect.

this would explain why i've never considered chrome to be a memory hog.. (although, all my machines have a lot of memory, too)

lukastsai 1 day ago 0 replies      
a mobile readable version: https://getscroll.com/r/mdqbs

I'm using https://github.com/gorhill/uBlock to replace Adblock plus since last week.

pwr22 1 day ago 0 replies      
This has already been shown to be the case with FF; ABP uses a lot of memory per tab.
bourbon 1 day ago 0 replies      
I use Ghost and Adblock+ with Chromium, and I've never had an issue with memory.
jmgtan 1 day ago 0 replies      
I've been using GlimmerBlock for quite some time without running into issues.
ins0 23 hours ago 2 replies      
yeah, iframes on every modern webpage... the problem is the adblock injection and its cost, of course. but please don't blame iframes for this... no one uses iframes anymore except for ads
sandsand 18 hours ago 0 replies      
It seems an overstatement to me. I would say it's misuse of iframes first, and only then adblock.
seoguru 20 hours ago 1 reply      
handy trick: install "incognito this tab" https://chrome.google.com/webstore/detail/incognito-this-tab... If an adblocker is interfering with a page, right-click and bring up the tab in incognito mode (which doesn't load extensions unless specified).
Pxtl 1 day ago 0 replies      
I find it's a hog and I don't run those.
znowi 17 hours ago 0 replies      
tl;dr "Please do not use ad blockers" - The ad industry
kolev 18 hours ago 1 reply      
They are memory hogs with or without particular extensions. I leave Chrome with 10 basic tabs open and in 4 hours it's already "using" 2GB of RAM. The new Firefox Developer Edition with "process per tab" is similar. The most resource-friendly browser on OS X is Safari.
icantthinkofone 21 hours ago 0 replies      
To be clear, Adblock Plus is a memory hog that causes issues for Firefox and Chrome users. The added memory use is not a problem of either browser; they do not cause it themselves.
ddingus 23 hours ago 0 replies      
I have been running Firefox without any blocker for a long time. I didn't mind the ads, and thought of them as an easy way to fund various things.

No worries, until those video and audio ads got out of control.

Some pages would bury my machine! And it's a good machine too: a 2012 MacBook Pro.

The memory profile of Firefox itself is kind of a pig, but not something I would worry too much about. Firefox with those ads? Out of control. Firefox with simple Adblock is back to no worries again.

I'll have to give uBlock a try.

tek-cyb-org 1 day ago 0 replies      
32gb says not.
insin 1 day ago 0 replies      
> Here's a lovely bit of irony for you: Adblock Plus... is actually increasing the amount of memory used by your web browser, rather than decreasing it

Its like ten thousand spoons when all you need is a knife.

hurin 21 hours ago 1 reply      
Firefox gets slower with every update, and it's certainly not Adblock Plus that's at fault (although yes, Adblock Plus has a very inefficient implementation).

If you use Chromium I recommend HTTPSwitchboard.

ddebernardy 21 hours ago 0 replies      
> On a modern website, there can be dozens of iframes.

Let me fix that for you:

> On a shit website, there can be dozens of iframes.

dpweb 23 hours ago 0 replies      
This in your JS should hide site content from adblock users (AdSense only):

  window.hasOwnProperty('google_ad_block') || (document.body.innerHTML = 'Please disable adblock to use this page')

Psychologists Strike a Blow for Reproducibility (2013)
64 points by jcr  9 hours ago   20 comments top 7
a_bonobo 5 hours ago 0 replies      
This article is a bit old (26 November 2013); for recent news on reproducibility, check the Reproducibility Initiative.

It's a movement by several companies and groups

Nature recently joined it, so if you want to publish a paper next year with them you have to at least complete the checklist to make reproducibility easier: http://www.nature.com/nnano/journal/v9/n12/full/nnano.2014.2...

They have also started to reproduce publications of voluntaries, here's a very recent paper detailing their replication efforts: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjourna...

Article summarizing the paper: http://www.nature.com/news/parasite-test-shows-where-validat...

Fede_V 7 hours ago 2 replies      
Yeah - this is a sample of the 'gold standard' of psychology papers, and only 10/13 could be reproduced.

The reasons for shoddy reproducibility are p-value hacking, intense pressure to publish at all costs, and a premium on 'gladwellesque' results where a simple theory seemingly explains a lot.

Andrew Gelman and Uri Simonsohn have both written a lot about this.

alexandros 9 hours ago 1 reply      
While I'm very happy to see more attempts at replication, I am quite shocked to hear the grandstanding that "reproducibility is not as much of a problem".

Seen as a meta-experiment it is incredibly weak, with a tiny, biased sample: high-profile results with procedures simple enough that they could be combined.

As such, it can't come even close to supporting what they're trying to claim (whoever "they" actually are; it may not be the actual researchers). What about less high-profile stuff? What about more complex setups? If 20% of your most rock-solid results are not reproducible, I wouldn't be so quick to celebrate. Imagine if 20% of basic physics or maths results weren't actually true...

darkxanthos 26 minutes ago 0 replies      
All of their data and methods are all online here: https://osf.io/ebmf8/

This is a very well done study and almost every criticism I've read in the comments is addressed in it if you read the write up.

UhUhUhUh 1 hour ago 1 reply      
<rant> And I think that psychology began to die when it became obsessed with statistics. Researching the mind has become an endless, bottom-up process with very few ideas and zero grand ideas. I'm not even talking about practice... Being a psychologist myself, I'm bored through my skull with this illusion of objectivity (the general linear model is only a theory) and the pathetic little results it methodically cranks out. The Rorschach is a test with very low statistical properties. I love it; it is useful. And I can prove it one case at a time. That's why I am learning computing and leaving the tedious gravy train of meta-analyses, rotated factor analyses and manualized, empirically validated methods behind me. </rant>
riffraff 8 hours ago 1 reply      
This could be equivalently titled "20% of the most well known psychology results are impossible to reproduce".

Also, among the ones that did, there are the Kahneman ones, and _he_ was the one to point out that most experiments are never reproduced so there was a higher chance that his results would be reproducible.

tokenadult 3 hours ago 0 replies      
The submission is from 26 November 2013. (I read the article when it was first published and have read related articles about the Many Labs Replication Project before.) The article kindly submitted here is by experienced science writer Ed Yong and links to some helpful background reading, including his report about Daniel Kahneman's open letter from 2012.

The Journal of Open Psychology Data published findings of the replication study mentioned here,[1] and the PsycNET site of the American Psychological Association provides a citation to the published version of the study findings in a psychology journal.[2] Improving replicability is an ongoing effort not just in psychology but in most branches of science, and is critically important in medical studies.

[1] http://openpsychologydata.metajnl.com/article/view/jopd.ad/1...

[2] http://psycnet.apa.org/journals/zsp/45/3/142/

The Secret to Learning a Foreign Language as an Adult
19 points by kintamanimatt  1 hour ago   30 comments top 11
fivedogit 17 minutes ago 4 replies      
I've said this before on HN, but it's worth repeating. I want a Warcraft-style MMO that immerses me in a foreign language. Instead of "Go kill 8 dragons" the questgiver tells me to "Mata a 8 dragones verdes" and it's up to me to figure out he said green dragons, not just any dragons.

I would play the hell out of that game and I'd be fluent in like 8 languages after a few months. Please, somebody with access to like $10 million of funny money, build this.

readme 41 minutes ago 0 replies      
Learn programming in 24 hours!

This is just fluff. Sure, the guy probably learned some French, but it's anecdotal. To assess proficiency he needs to be tested by native experts in all the modalities of language: speaking, listening, reading, and writing. Not by the opinion of some girl who probably liked him at a coffee shop.

Where I am, we take 6 months to get a student from 0 to basic proficiency in French, i.e. able to read/listen to news and discuss advanced topics like economics. That's with 6 hours of class a day, M-F, all with native speakers.

If you really want to learn a language efficiently, I'd recommend this ted talk: https://www.youtube.com/watch?v=d0yGdNEWdn0

BTW, I doubt his 6-month mark applies to harder languages like Chinese.

seanoliver 37 minutes ago 2 replies      
I wouldn't call this the "secret" to learning a foreign language as an adult. Obviously immersion in a language is ideal, and the fact that he had already learned Spanish is even more ideal.

I'm currently 2 years into my attempt to learn Chinese and I can say that many of these tips don't apply to all languages:

1. Listening to music won't help with comprehension of tonal languages like Chinese because songs will usually ignore the tones so that they sound better set to music.

2. Reading children's books in character-based languages like Chinese will only be helpful if you're already proficient in a few hundred basic characters. Since there's no alphabet, there's no way to sound out words the way we can in English.

Otherwise, there are some great tips here. I agree that listening to classroom discussion and hearing others' mistakes is a great way to learn. It's also important to do daily, focused practice in the mornings when your mind is fresh and not muddled by other things.

Overall I think it's a good Quora answer but not necessarily a "secret" to learning a language.

dghughes 5 minutes ago 0 replies      
Calling it a 'foreign' language sounds odd to me.

As a Canadian, French is not foreign in the sense of not being a language of this nation, even though I don't speak it. Maybe "foreign to the person," not to your nation, is what's meant.

Even in the US, Spanish wouldn't be foreign, even though English is the only unofficial official language. French too is part of the US's languages, from parts of Maine to my Acadian neighbours who went to Louisiana.

hackerboos 35 minutes ago 1 reply      
Let's face it: learning a foreign language is one of those things that everyone would like to do, but most are either too busy or not committed enough to do it.

Similar to the fitness industry, those 2 barriers have spawned an industry of 'learn in your car', 'French in 30 days' and videos by polyglots who sell the idea that language acquisition is easy.

Now I'm going to tell you the hardest part of learning a foreign language.

There are no shortcuts. It takes time, it takes dedication and it will most probably cost money.

vidanay 52 minutes ago 3 replies      
Can we now see a similar article for how to learn a foreign language in 17 days while commuting 30 minutes each way to an 8-hour job 5 days a week and while raising a child or two?

Seems like just about anyone can learn to speak French when given the opportunity to immerse themselves in a bucolic French village with nothing to do!

tokenadult 2 minutes ago 0 replies      
I've been developing a FAQ on language learning as this interest is mentioned on Hacker News from time to time. The article kindly submitted here mentions learning French in France by a (native?) speaker of English who had previously learned Spanish. All of those are Indo-European languages, more or less cognate with one another. I've taken on some tougher language-learning challenges over the years. As I learned Mandarin Chinese up to the level that I was able to support my family for several years as a Chinese-English translator and interpreter, I had to tackle several problems for which there is not yet a one-stop-shopping software solution.

I hope the FAQ information below helps hackers achieve their dreams. For ANY pair of languages, even closely cognate pairs of West Germanic languages like English and Dutch, or Wu Chinese dialects like those of Shanghai and Suzhou, the two languages differ in sound system, so that what is a phoneme in one language is not a phoneme in the other language.


But a speaker of one language who is past the age of puberty will simply not perceive many of the phonemic distinctions in sounds in the target language (the language to be learned) without very careful training, as disregard of those distinctions below the level of conscious attention is part of having the sound system of the speaker's native language fully in mind. Attention to target language phonemes has to be developed through painstaking practice.


It is brutally hard for most people (after the age of puberty, and perhaps especially for males) to learn to attend to sound distinctions that don't exist in the learner's native language. That is especially hard when the sound distinction signifies a grammatical distinction that also doesn't exist in the learner's native language. For example, the distinction between "I speak" and "he speaks" in English involves a consonant cluster at the end of a syllable, and no such consonant clusters exist in the Mandarin sound system at all. Worse than that, no such grammatical distinction as "first person singular" and "third person singular" for inflecting verbs exists in Mandarin, so it is remarkably difficult for Mandarin-speaking learners of English to learn to distinguish "speaks" from "speak" and to say "he speaks Chinese" rather than * "he speak Chinese" (not a grammatical phrase in spoken English).

Most software materials for learning foreign languages could be much improved simply by including a complete chart of the sound system of the target language (in the dialect form being taught in the software materials) with explicit description of sounds in the terminology of articulatory phonetics, with full use of notation from the International Phonetic Alphabet.


(By the way, the International Phonetic Alphabet was invented by language teachers in Europe to help native speakers of English learn French and native speakers of French learn English, so it could help the author of the article submitted to open this thread. The International Phonetic Alphabet was eventually extended to be useful for writing down any human language.) Good language-learning materials always include a lot of focused drills on sound distinctions (contrasting minimal pairs in the language) in the target language, and no software program for language learning should be without those. It is still an art of software writing to try to automate listening to a learner's pronunciation for appropriate feedback on accuracy of pronunciation. That is not an easy problem.

After phonology, another huge task for any language learner is acquiring vocabulary, and this is the task on which most language-learning materials are most focused. But often the focus on vocabulary is not very thoughtful.

The classic software approach to helping vocabulary acquisition is essentially to automate flipping flash cards. But flash cards have ALWAYS been overrated for vocabulary acquisition. Words don't match one-to-one between languages, not even between closely cognate languages. The map is not the territory, and every language on earth divides the world of lived experience into a different set of words, with different boundaries between words of similar meaning.

The royal road to learning vocabulary in a target language is massive exposure to actual texts (dialogs, stories, songs, personal letters, articles, etc.) written or spoken by native speakers of the language. I'll quote a master language teacher here, the late John DeFrancis. A few years ago, I reread the section "Suggestions for Study" in the front matter of John DeFrancis's book Beginning Chinese Reader, Part I, which I first used to learn Chinese back in 1975. In that section of that book, I found this passage, "Fluency in reading can only be achieved by extensive practice on all the interrelated aspects of the reading process. To accomplish this we must READ, READ, READ" (capitalization as in original). In other words, vocabulary can only be well acquired in context (an argument he develops in detail with regard to Chinese in the writing I have just cited) and the context must be a genuine context produced by native speakers of the language.

I have been giving free advice on language learning since the 1990s on my personal website, and the one piece of advice I can give every language learner reading this thread is to take advantage of radio broadcasting in your target language. Spoken-word broadcasting (here I'm especially focusing on radio rather than on TV) gives you an opportunity to listen and to hear words used in context. In the 1970s, I used to have to use an expensive short-wave radio to pick up Chinese-language radio programs in North America. Now we who have Internet access can gain endless listening opportunities from Internet radio stations in dozens of unlikely languages. Listen early and listen often while learning a language. That will help with phonology (as above) and it will help crucially with vocabulary.

The third big task of a language learner is learning grammar and syntax, which is often woefully neglected in software language-learning materials. Every language has hundreds of tacit grammar rules, many of which are not known explicitly even to native speakers, but which reveal a language-learner as a foreigner when the rules are broken. The foreign language-learner needs to understand grammar not just to produce speech or writing that is less jarring and foreign to native speakers, but also to better understand what native speakers are speaking or writing. Any widely spoken modern language has thick books reporting the grammatical rules of the language, and it is well worth your while to study books like that both about your native language(s) and about any language you are studying.

A special bonus for learners of French (which I have used) is that many classic French literature books (novels, collections of short stories, collections of essays, etc.) are now in the public domain, and are available as free-of-charge ebooks. You can practice a lot of reading French with resources like that, and relearn classic tales you knew in youth. Similarly, today there is boundless free audio, for example in the form of online movies and streaming news broadcasts, in all of the major world languages. Take advantage of that as you learn.

Bonne chance.

wodenokoto 43 minutes ago 0 replies      
This is in no way a secret to learning anything. Almost nobody will be able to learn anything from such an intense workload of new things.

Yes, some people can work hard and retain things throughout. Beethoven could remember an entire symphony from listening to it once, but that hardly translates to anything useful for normal music students.

simonswords82 33 minutes ago 1 reply      
Yet another "article" that is just a copy/paste of a Quora post. Seems to be happening with more frequency.
houshuang 48 minutes ago 1 reply      
Learning French while being fluent in Spanish, not very impressive.
sparkzilla 32 minutes ago 0 replies      
Step 1: Get a girl/boyfriend who is a native speaker.
Interactive Programming in C
85 points by luu  17 hours ago   8 comments top 5
anon4 5 hours ago 1 reply      
I know the Visual Studio C++ debugger has edit-and-continue functionality. It would be nice to read how that is implemented, since it's obviously much more invasive than this.
listic 4 hours ago 0 replies      
Hasn't anyone made a framework along these lines? The main thing I would like such a framework to handle for me is cross-platform capability: shared library implementations differ on each operating system, unfortunately for this method.
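The reload-a-shared-library trick from the article has a loose analogue in higher-level languages, which also sidesteps the per-OS dlopen differences mentioned above. A minimal Python sketch using importlib.reload (the module name live_logic is invented for the example; this illustrates the idea, not the article's C implementation):

```python
# Edit-the-code-while-it-runs, Python style: rewrite a module's source
# on disk, then reload it so the running program picks up the change.
# This mirrors the article's dlclose()/dlopen() cycle in C.
import importlib
import pathlib
import sys

sys.dont_write_bytecode = True          # always recompile from source
pathlib.Path("live_logic.py").write_text("def answer():\n    return 1\n")
sys.path.insert(0, ".")

import live_logic
print(live_logic.answer())              # 1

pathlib.Path("live_logic.py").write_text("def answer():\n    return 2\n")
importlib.reload(live_logic)            # the "dlopen again" moment
print(live_logic.answer())              # 2
```

The C version keeps program state alive across reloads the same way: state lives outside the reloaded unit, and only the code is swapped.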
tehwalrus 6 hours ago 0 replies      
After reading the first few lines of this post, I thought it would be some complicated hack about function pointers.

The actual solution is much simpler (to understand and to use, I imagine) and that made me smile, nice one!

to3m 7 hours ago 0 replies      
poseid 5 hours ago 1 reply      
Nice one. Explains why many people love and use C.
Distributed Python for the Erlang Ecosystem
37 points by rcarmo  6 hours ago   3 comments top 3
IndianAstronaut 0 minutes ago 0 replies      
What are the differences between this and Disco?
jlouis 4 hours ago 0 replies      
For the record in the interest of programmers who don't use Erlang on a daily basis:

It is very common to use Erlang as a "glue" system between other subsystems in larger installations. You will often find the Erlang system controlling code written in other languages, for different reasons: The large Java subsystem which is hard to replace. The C/C++ code you run as a hidden node() in the Erlang distribution cluster. The C-accelerated function you added to the Erlang BEAM VM through dynamic loading of a .so (called a NIF). The OCaml or Go program you communicate with through a port (essentially an Erlang-controlled pipe with a proxy light-weight process inside the Erlang VM). And so on.

Sometimes, the right tool for the job is another system. The tool 'py' provides yet another such bridge which allows you to interact with Python programs in a very direct way.
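For the curious, the port mechanism described above is just framed I/O over a pipe: with Erlang's `{packet, 2}` option, every message carries a 2-byte big-endian length prefix. A minimal Python-side sketch of that framing (this is the generic port protocol from the Erlang documentation, not the `py` tool's own API):

```python
# Length-prefixed framing as used by Erlang ports opened with {packet, 2}:
# each payload is preceded by its length as a 2-byte big-endian integer.
import io
import struct

def write_packet(stream, payload: bytes) -> None:
    stream.write(struct.pack(">H", len(payload)) + payload)

def read_packet(stream):
    header = stream.read(2)
    if len(header) < 2:
        return None                     # peer closed the port
    (length,) = struct.unpack(">H", header)
    return stream.read(length)

# Round-trip through an in-memory buffer instead of a real pipe:
buf = io.BytesIO()
write_packet(buf, b"hello from python")
buf.seek(0)
print(read_packet(buf))                 # b'hello from python'
```

In a real port program the same two functions would run against sys.stdin.buffer and sys.stdout.buffer in a loop.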

rcarmo 4 hours ago 0 replies      
The X-15, the fastest manned aircraft ever made
56 points by jonbaer  10 hours ago   14 comments top 8
Luyt 6 hours ago 0 replies      
From the book Ignition! (http://library.sciencemadness.org/library/books/ignition.pdf)

"But something more potent than alcohol was needed for the X-15 rocket-driven supersonic research plane. Hydrazine was the first choice, but it sometimes exploded when used for regenerative cooling, and in 1949, when the program was conceived, there wasn't enough of it around, anyway. Bob Truax of the Navy, along with Winternitz of Reaction Motors, which was to develop the 50,000 pounds thrust motor, settled on ammonia as a reasonably satisfactory second best. The oxygen-ammonia combination had been fired by JPL, but RMI really worked it out in the early 50's. The great stability of the ammonia molecule made it a tough customer to burn and from the beginning they were plagued with rough running and combustion instability. All sorts of additives to the fuel were tried in the hope of alleviating the condition, among them methylamine and acetylene. Twenty-two percent of the latter gave smooth combustion, but was dangerously unstable, and the mixture wasn't used long. The combustion problems were eventually cured by improving the injector design, but it was a long and noisy process. At night, I could hear the motor being fired, ten miles away over two ranges of hills, and could tell how far the injector design had progressed, just by the way the thing sounded. Even when the motor, finally, was running the way it should, and the first of the series was ready to be shipped to the West Coast to be test-flown by Scott Crossfield, everybody had his fingers crossed. Lou Rapp, of RMI, flying across the continent, found himself with a knowledgeable seat mate, obviously in the aerospace business, who asked him his opinion of the motor. Lou blew up, and declared, with gestures, that it was a mechanical monster, an accident looking for a place to happen, and that he, personally, considered that flying with it was merely a somewhat expensive method of suicide. Then, remembering something he turned to his companion and asked. "By the way, I didn't get your name. What is it?". 
The reply was simple. "Oh, I'm Scott Crossfield."

dbarlett 57 minutes ago 0 replies      
NASA published a comprehensive (if a bit dry at times) ebook, X-15: Extending the Frontiers of Flight [1] that covers the development and test program. If you're ever near Dayton, Ohio, the National Museum of the USAF has X-15A-2 [2] and an XLR99 [3] on display.

[1] http://www.nasa.gov/connect/ebooks/aero_x15_detail.html

[2] http://www.nationalmuseum.af.mil/factsheets/factsheet.asp?id...

[3] http://www.nationalmuseum.af.mil/factsheets/factsheet.asp?id...

tdicola 7 hours ago 0 replies      
That reminds me, earlier this month I saw a great PBS Nova episode about Neil Armstrong's life: http://www.pbs.org/wgbh/nova/space/first-man-on-moon.html It talked a bit about the X-15 and some crazy situations Neil got into flying them, like skipping off the top of the atmosphere at supersonic speeds.
forgingahead 5 hours ago 0 replies      
Very cool. Also if you want a wider view of the content, fire up the web inspector console and enter the following:

  $(".main-column").attr('class', 'large-12 columns main-column');
  $(".hide-for-medium-down").remove();

listic 1 hour ago 2 replies      
What did we* need such hypersonic rocket planes for and why did we stop improving them?

*we as a species

gre 7 hours ago 2 replies      
Besides the space shuttle?
One Frickin' User Interface for Linux (2003)
7 points by networked  3 hours ago   discuss
How many genetic ancestors do you have?
36 points by danieltillett  9 hours ago   7 comments top 3
SideburnsOfDoom 4 hours ago 2 replies      
> The number of genealogical ancestors you have n generations back is 2^n: 2 parents, 4 grandparents, 8 great-grandparents, and so forth. The only way to have fewer is if some of them are the same person.

They leave out that this effect, "some of them are the same person," is inevitable, even pervasive, past a few generations. Never mind that the number of "genealogical ancestors" rapidly becomes larger than the number of humans alive at the time; it's almost certain that your great*10 grandparents didn't travel much, so they existed as part of a far smaller, sheltered gene pool.
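The doubling arithmetic behind this is easy to check; a quick sketch (the years-per-generation figure is a rough assumption of mine, not from the comment):

```python
# Genealogical ancestor slots double each generation back: 2**n.
for n in [1, 2, 3, 10, 30]:
    print(n, 2 ** n)

# Around 30 generations back (roughly 800-900 years, assuming ~28 years
# per generation) the slot count exceeds a billion, far more than the
# world population of the time, so "some of them are the same person"
# is not an edge case but a certainty (pedigree collapse).
assert 2 ** 30 > 1_000_000_000
```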

danieltillett 6 hours ago 1 reply      
I posted this because I have always wondered what the answer to this question was, but could not do the maths - it is very hard and can only really be done by simulation, as Bob has done.

I suspect that the 46,000 genetic segments is an approximation, but it should not mess up the calculations too much if this is wrong. The truly amazing thing is how few real ancestors you have out of the possible number.

apaprocki 2 hours ago 0 replies      
I highly recommend using an existing DNA service (I use FTDNA) to sequence the Y chromo and/or MTDNA of your living direct relatives (where it makes sense.. only one Y for male line, etc) because the samples are kept in storage and you can use them for sequencing later as technology drastically ramps up. My Y data is available to researchers and my and my father's MTDNA are in Genbank. Genetic anthropologists use the data to fine tune our knowledge of historical population movements. The current theory is that I am descendent from the Vistula Veneti. I just wish more people had an interest so that there were more data samples. You can't just go around demanding peoples' DNA, even if it has nothing to do with the parts that influence your health.
How Colonel Sanders Became Father Christmas in Japan
147 points by gscott  18 hours ago   44 comments top 11
patio11 9 hours ago 1 reply      
For more on this topic, see The Colonel Comes to Japan, a 30 minute video with a fairly scholarly treatment of KFC's expansion to Japan in the 1970s. It is part academic study and part compulsively watchable television.

The single best anecdote among many is how the statue of the Colonel, which is outside every KFC in Japan, came to be associated with the brand here. Apparently when the guy in charge of KFC Japan was put in charge of the unit, HQ didn't really think a poor Asian country merited any marketing support, but they let him use anything they had in storage. Somebody had produced several hundred Colonel statues but they were deemed too ridiculous to use. "So I said 'I'll take all of them.'" "And did you know they would be a hit with Japanese people?" "No clue, but something is better than nothing."

(Also worth mentioning: the guy's right hand man, who was Japanese, gets into an argument with him over whether selling KFC as "aristocratic American elegance" to Japanese people is exploitative, since, quote the American executive, "We sell fried chicken to poor people." "Maybe you do over there, but we won't over here." Topics in the discussion included cultural appropriation, memories of the war and associated famines, and the desire of an emerging economic superpower to consume things associated with wealth, like meat.)

The easiest way to find the video is probably your local library or university library, as it was made in the early 1980s.

tristanho 17 hours ago 4 replies      
Probably the most shocking/interesting line from the article in my opinion:

The first Kentucky Christmas meal sold for a pricey $10 (almost $48 in 2014 money) and contained fried chicken and wine; now, KFC's Japanese Christmas meals cost about $40 and come with champagne and cake.

Natsu 15 hours ago 1 reply      
There's precedent for this in America, where Coca Cola's representation of St. Nicholas did something similar.


edent 16 hours ago 2 replies      
There's all sorts of weird cross-cultural exports in the world. I remember driving through the USA and seeing Clive Owen advertising "Three Olives London Vodka" - a brand I'd never heard of despite living close to London my entire life.

Similarly, I suspect the "Chicago Town Pizza" I see being advertised over here as a real slice of American life, would make a true Chicagoan weep.

raverbashing 10 hours ago 1 reply      
I don't really believe most of articles like this

I think most details are lost in translation, or in the level of seriousness of things ("Is Mall Santa the real Santa?"/"Does Santa actually exist?"), coupled with a certain exaggerated American sense of self-importance and self-centeredness.

Or, as someone told me, a kid was told "this is not the USA, we don't celebrate Halloween!". This was in Ireland

poglet 12 hours ago 2 replies      
I took a 1-minute video showing my local KFC in comparison to the next-door McDonald's. Sorry about the video quality.

I came back later that night and there was a sign outside indicating that I needed to have a phone order (maybe) to get in the queue.


KFC has made fried chicken popular on Christmas in Japan, but now you can get fried chicken everywhere, including the 7-Eleven and Family Mart stores that have Christmas packages.

Someone mentioned alcohol at fast food chains. It's only at special, unique fast food restaurants. Drinking in Japan is strange; for example, if I leave a club with a glass of alcohol in my hand, the door person will pour it into a plastic cup for me to drink out on the streets.

AntonTell 17 hours ago 0 replies      
Interesting how they actually succeeded in getting such a foothold. Here in Sweden, Coca-Cola's sales have been quoted as dropping by 50% during Christmas due to Julmust being so popular during the holidays[1].

[1] http://en.wikipedia.org/wiki/Julmust#Julmust_and_Coca-Cola

dilap 16 hours ago 3 replies      
Fun article.

KFC is also really popular in Vietnam, which I was surprised to discover when I visited about 10 years ago. I think the uncanny resemblance of Colonel Sanders to Ho Chi Minh might have something to do with it.

dghughes 12 hours ago 0 replies      
If they had stuffing on the menu it would be a pretty good substitute for a Christmas dinner.
MrMan 15 hours ago 0 replies      
rasz_pl 16 hours ago 1 reply      
Wow. This is pretty depressing.

It sounds like a movie plot from


Idris: Type safe printf [video]
44 points by kenhty  11 hours ago   12 comments top 6
bronty 1 hour ago 0 replies      
You can check format strings in Java at compile-time using the Checker Framework: http://types.cs.washington.edu/checker-framework/current/che...

The format string semantics are actually pretty tricky because printf will perform conversions in certain cases.

wheaties 1 hour ago 1 reply      
This was awesome and yet one more reason to learn Idris. Now I have to see if it can be done in Scala.
ufo 3 hours ago 1 reply      
Why define Format as

    data Format = FInt Format
                | FString Format
                | FOther Char Format
                | FEnd
instead of using lists?

     data FormatPart = FInt | FString | FOther Char
     type Format = List FormatPart

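For readers who don't know Idris: the `Format` type above is what gets computed from the format string at compile time, and the type of printf is then derived from it. A rough runtime Python analogue of the list-based `FormatPart` encoding (the parser details are my own sketch, not taken from the video):

```python
# Parse a printf-style format string into a list of directives,
# mirroring the FormatPart encoding from the comment above. Idris
# computes printf's *type* from this list at compile time; Python can
# only inspect it at runtime.
def parse_format(s: str):
    parts, i = [], 0
    while i < len(s):
        if s[i] == "%" and i + 1 < len(s):
            spec = s[i + 1]
            if spec == "d":
                parts.append("FInt")
            elif spec == "s":
                parts.append("FString")
            else:
                parts.append(("FOther", spec))
            i += 2
        else:
            parts.append(("FOther", s[i]))
            i += 1
    return parts

print(parse_format("%d:%s"))   # ['FInt', ('FOther', ':'), 'FString']
```

A type checker with dependent types can then demand exactly one Int and one String argument for this string, which is the point of the talk.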
wz1000 4 hours ago 1 reply      
Really cool, but I am assuming printf only works for values which are determined at compile time?
ExpiredLink 7 hours ago 1 reply      
socceroos 6 hours ago 1 reply      
My mind immediately went to Star Citizen...
Is eating behavior manipulated by the gastrointestinal microbiota?
43 points by mparramon  17 hours ago   6 comments top 2
dlss 2 hours ago 1 reply      
This is a very interesting (though not new) hypothesis. http://www.ncbi.nlm.nih.gov/pubmed/21303428 is the usual citation.

Neither paper has data on how much of your dietary behavior is microbe-related, which is a shame. Here's the data I have from developing a pseudo-FMT probiotic:

During initial beta testing, I found myself snacking less and forgetting to eat. The forgetting to eat bit was extremely abnormal, as I'm usually very regular about meal times. I also found myself eating breakfast for the first time since starting at college.

These and the experiences of my co-founder were sufficiently interesting that we added questions to our pilot study, where 72% of participants reported a reduction in food cravings and mild weight loss.

So the effect is much bigger than you might imagine. Here's a customer describing his rather extreme subjective experience:

"I started a total fast on Monday, and for the first 3 days only consumed water or green tea (and your probiotic), and today I only had BCAAs and a plain salad with vinegar, and so far this has been by far the easiest fast I've ever done. I haven't even gotten to the point where I feel physically hungry yet, and even my psychological desire for food is much weaker than it has been in past fasts."

If anyone here is interested in a longer discussion of health-related microbiome research, http://www.generalbiotics.com/science (which my co-founder wrote) provides a fairly thorough overview.

dschiptsov 2 hours ago 1 reply      
> Alternative hypotheses for unhealthy eating and obesity

Where is the hypothesis that states of hunger and states of anxiety are "signaled through the same mechanisms" (sorry, no time to locate all the long names), and that the state which in other languages is called "being full" gives "relief" from anxiety and stress? In other words, overeating is just a misuse of food as an answer to increased stress.

Binge eating is not like binge drinking of a stupid youth, it is like "aged" alcoholism of "staying stable" with a glass in each two hours. And it is much closer to tobacco smoking than to drug abuse (there is a nice course "Drugs and Brain" on coursera, btw).

Sometimes I wish I could be one of these "meme"-scientists claiming that they have found oversimplified single cause (it is microbes, like fly, stupid!) for a vastly complex, multiple-causation phenomena. Of course, microbes have some influence over host's eating behavior, via described neuro-chemical feed-back loops, but of course this is not a "single" or even the "most powerful" cause.

"Change of social norms" was a good insight - now it is OK to live this way.

The Wire in HD
273 points by danso  1 day ago   37 comments top 11
cllns 23 hours ago 0 replies      
For an example of what can go wrong: the 'Buffy the Vampire Slayer' HD Remaster has a few ruined shots.

[1]: http://www.vox.com/2014/12/12/7385261/buffy-ruined

danso 23 hours ago 4 replies      
edit/correction: As [leehro](https://news.ycombinator.com/item?id=8803190) has pointed out, Amazon has a widescreen version but not necessarily the new HBOGO one:


I've re-watched "The Wire" about 3 to 4 times already...the new HD release is already on Amazon Prime Instant Video and I don't think I would've noticed if the news hadn't been posted. On The Wire subreddit, someone posted a few more comparisons:


If you're new to The Wire, I suggest watching the HD versions first, in spite of Simon's compelling arguments to the contrary. The Wire just has so many disorienting things about it for new watchers -- the barrage of police lingo, the high ratio of black to white characters, the 90s-era technological timeframe -- that the 4:3 ratio's fidelity isn't worth the mental bias you might have that associates 4:3 with "cheap" or "old"...the show can already be difficult for people to get into.

Speaking of the r/TheWire subreddit...a year ago, a sound editor did a Q&A...if you're interested in the minutiae of the show, and curious about how the technical details of sound mixing are done...it's one of the best things I've ever read about the show:


bedhead 19 hours ago 1 reply      
This show is so profoundly good that I could care less about the resolution and aspect ratio. I got so lost in the show that I never would've noticed. The game is the game. Always.
raldi 20 hours ago 4 replies      
For all the ink he spends on the "crossing over" concept, I still have no idea what he's referring to. That would have been a great place to embed an illustrative clip.
modeless 23 hours ago 1 reply      
HBO should release it in HD with a 4:3 aspect ratio. It would still be a gigantic improvement over 480i.
Kiro 2 hours ago 1 reply      
I skimmed the article and watched the clips but I don't understand what the problem is. Can someone enlighten me?
protomyth 16 hours ago 0 replies      
The funny part is that the iPad brought back the 4:3 ratio. I wonder if the unconverted will be available in iTunes?
carlob 21 hours ago 1 reply      
> It vexes them in the same way that many with color television sets were long ago bothered by the anachronism of black-and-white films, even carefully conceived black-and-white films.

Is he obliquely saying that the 16:9 version of The Wire is going to be like colorized black-and-white films?

m0nty 23 hours ago 1 reply      
I just watched all five seasons again, and the 4:3 ratio was a bit of a distraction. The quality seems lower as well, almost like I could see the scan lines.

For the video enclosed in the article: warning for spoilers and mild gore.

mxfh 19 hours ago 1 reply      
One question. Are they shifting around the center of frame or is it safe to do soft-letterboxing to 4:3 if your playback device allows for it?

Basically, using the 16:9 HD version as some kind of Open Matte source material; if there are no shifts between scenes, this should be trivial.


kristofferR 21 hours ago 0 replies      
I've watched the first couple of HD 16:9 episodes (thanks BATV!), and as far as I could see, they were flawless.

I was afraid it wouldn't work in 16:9, and thought I would prefer it to be kept in 4:3 like the Star Trek TNG HD remasters, but I actually prefer the new 16:9 ratio over the old ones.

A Primer on Bézier Curves (2013)
94 points by davidbarker  15 hours ago   18 comments top 6
jarek-foksa 2 hours ago 2 replies      
Are there any creative tools that would make use of quadratic Bezier curves? While cubic Beziers are easier to tweak, it is much easier to draw smooth shapes with quadratic Beziers. They also seem to be easier to implement (only one control point); I'm not sure why major vector and pixel graphics editors don't support them.

Ideally I would prefer to have an option to draw paths with either quadratic or cubic Bezier segments, but edit paths normalised to cubic Bezier segments.
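The single-control-point evaluation mentioned above is short enough to sketch (an illustrative snippet, not code from the primer):

```javascript
// Evaluate a quadratic Bézier curve at parameter t in [0, 1].
// B(t) = (1-t)^2 * P0 + 2(1-t)t * P1 + t^2 * P2
// P0 and P2 are endpoints; P1 is the single control point.
function quadBezier(p0, p1, p2, t) {
  const u = 1 - t;
  return {
    x: u * u * p0.x + 2 * u * t * p1.x + t * t * p2.x,
    y: u * u * p0.y + 2 * u * t * p1.y + t * t * p2.y,
  };
}
```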

agumonkey 14 hours ago 1 reply      
This is amazingly exhaustive, and for those still hungry, there's always the Nurbs Book http://www.springer.com/computer/image+processing/book/978-3...
delinka 2 hours ago 1 reply      
Combine this with what we learned from Pixar about smoothing surfaces and it appears that Pixar has adapted the Bezier curve to 3D.

I'm always a little geeked-out by unification like this. :D

kghose 13 hours ago 2 replies      
This is a very nice tutorial. One small point: coordinates are expressed in an "X/Y" format, which was very confusing to me because I was thinking of them as ratios. It might be better to print them in the commonly used tuple convention (X, Y).
dclowd9901 11 hours ago 0 replies      
One of my favorite side projects was building a bezier sprite pather for Impact.js game engine (was working on a moth-balled top-down shooter).


This is one of the sites that contributed to that work. So strange to see it appear out of context here.

gopalv 13 hours ago 2 replies      
Is it some sort of coincidence that both Bezier and Casteljau were working for car companies?

Renault & Citroen feels like they were design powerhouses in this context, with numerical industrial design.

The curves were like compressed specifications, instead of being "make it look like this" with a sculpted model. That said, cars from the same era built by aerodynamics engineers also resulted in beautiful curves (like a Jaguar E Type).

The Slow Death of Do Not Track
54 points by r0h1n  11 hours ago   34 comments top 6
rgro 7 hours ago 4 replies      
The worst part about the DNT header was the requirement for the tracking companies to regulate themselves. Initially, the header was opt-in, but with the introduction of IE10, Microsoft decided that the option was going to be opt-out. Once the DNT header was gaining traction and a not-so-small percentage of people began sending the header, the companies had no reason to comply, and the initiative sorta fell out of favor.

For blocking tracking, the most effective tools are browser extensions made to block ads. Ghostery provides comparisons on a non-biased website between the methods of blocking tracking through browser modifications [1]. According to the site, the Do Not Track header actually has an effect, with a difference of 18% in cookie size when the header is set. AdBlock Edge and disabling third-party cookies result in a 59% and 40% decrease in cookie size respectively. It seems that the easiest thing you can do to lessen your internet footprint would be to disable third-party cookies and enable the DNT header, and the majority of tracking can be eliminated through the use of a browser extension. (But with the recent revelations [2], using a browser extension may actually reduce your browsing experience if you don't have the RAM to spare.)

[1] http://www.areweprivateyet.com/[2] https://news.ycombinator.com/item?id=8802424

eli 1 hour ago 0 replies      
It was a weak idea from the start. If you trust advertising companies to do what they say, then there's already an opt-out tracking system: http://www.networkadvertising.org/choices/ The bad actors (particularly ones not based in the US) were going to ignore DNT anyway.

Now, granted, it's technically far inferior to a DNT header (it sets a cookie on each ad network domain) but as far as I can tell it works and has worked for years.

thomasfoster96 6 hours ago 2 replies      
DNT was essentially dead quite a while ago.

If we are going to get something like Do Not Track, then it should have been drafted out of the public eye, had a nice short period for public comment and then received some sort of backing in law. Speculative implementations didn't really help.

I'm not too familiar with the laws surrounding things like 'do not call' lists and anti-spam measures, but some sort of system from that area of law could surely have been a part of DNT.

username223 1 hour ago 0 replies      
It never made sense in the first place. It was an opt-in, voluntary restriction that destroys all of advertisers' supposed value with no legal consequences if they ignore it.

The only real solution is client-side, and we have that technology now: hosts-blocking, Ghostery, AdBlock, etc. If enough people cared, it could be enabled by default on new browser installs.

carlosrg 5 hours ago 1 reply      
This is one thing that could have been done better in Europe.
atoponce 4 hours ago 0 replies      
DNT is the product of "technology by committee". It was a disaster out of the gate.
Jeff Hawkins: Brains, Data, and Machine Intelligence [video]
46 points by superfx  15 hours ago   13 comments top 4
appreneur 6 hours ago 0 replies      
I was fortunate to discover Jeff Hawkins in early 2007, when he was exploring brain science. I went through his lectures for entrepreneurs at ecorner.stanford.edu; being Indian, I had to literally slow down the video to understand his speech. He is like Bill Gross: very, very fast in articulating and highly energetic in his talks. You can feel their passion in their work.

Incidentally, Jeff Hawkins was the first to work on Palm software.....you could say early versions of today's smartphones.

Coming back to brain science, I somehow feel it has more to do with complexity (Santafe.edu).

I am waiting for Jeff Hawkins to launch a university like santafe.edu (complexity) which works exclusively on brain science. Perhaps that would increase the application of brain science.

java-man 8 hours ago 2 replies      
I think this is the most interesting direction of research in CS one can get involved with today. In 10-20 years, half of CS graduates will be working in computational biology and the other half developing SDR-based machine intelligence.
eli_gottlieb 4 hours ago 1 reply      
Please stop talking about machine intelligence without reference to normatively correct principles of quantitative reasoning. "I tried it and it scored well in validation" just shows that you've managed to apply some kind of statistical principle, not that you've found a way to build complete machine intellect.
java-man 8 hours ago 1 reply      
What might be interesting is to stream stock market data (trades as well as companies' press releases) into the Grok algorithm. People who manage this by the next market crash will be very rich indeed.
Neo900: Crafting the private phone [pdf]
34 points by nextos  15 hours ago   23 comments top 6
mikegioia 3 hours ago 1 reply      
I'm a recent n900 convert and at this point I don't know if I can ever go back to Android/iOS. I'm eagerly awaiting the release of the neo900 and I honestly think this device is the closest we'll come to a truly open phone that you can use everyday.

I encourage anyone interested to consider supporting Joerg, Werner, Sebastian and everyone else working on this project! [1] It's 100 euro now, and at the very least you get a really cool device to hack around with.

[1] http://neo900.org/donate

seba_dos1 2 hours ago 0 replies      
There's also a recording from the presentation. That was my first "big" talk in English and it's sometimes hard to listen to, but maybe someone will find it useful :) https://www.youtube.com/watch?v=ahPFCFooBv0&list=PL-s0IumBit...
edwintorok 8 hours ago 0 replies      
See also http://neo900.org/news/xmas-update for a status update.
Timmmmmm 2 hours ago 3 replies      
Nice to see someone is attempting this. Physical keyboard and resistive touchscreen seem like pretty odd decisions though. I guess this is going to be as enthusiast-only as the previous Neo phones which is a shame.
richiverse 1 hour ago 2 replies      
Can't you just boot Ubuntu touch on a Nexus 4 or dual boot a oneplus with 3gb ram?
dogma1138 6 hours ago 1 reply      
I never got the purpose of projects like this; at least they admit that there is no current way of guaranteeing a safe device. Even with "open" firmware, virtually any commercial hardware can be either backdoored or exploited. Modern phones (as well as any other computing device) have so many software components that I suspect intelligence agencies do not even have to exploit the main user OS to get data off them: modem and BBP OS, CPU microcode, bootloaders, firmware for other hardware components ranging from the camera to radios not covered by the BBP, USB/OTG controllers and more.

I have serious doubts that "sandboxing" hardware components will actually work, especially in any usable way. For example, modems behave badly all the time, both due to internal glitching and due to environmental reasons. Heck, dump the BBP logs from your mobile device and see how much shit it pulls off regardless of what the OS tells it to do, because its manufacturer thinks they know better (and they do, most of the time). So every time there is some BS signal on the same frequency, or the modem decides to wake up or hop channels on its own, the device will reboot? GL making a single call with that then.

Not to mention that I really don't see how they are going to do monitoring effectively. From the schematics they are doing power-management monitoring, which is nice, but most BBP packages can receive even in "sleep" mode. The isolation they are suggesting might also not really work: sure, putting in "breakers" (which I assume are either low-voltage relay switches or resettable fuses) and cutting the power off completely to the BBP might work, but cutting off the antenna from the BBP? Doubtful. RF signals should be able to hop most in-circuit breakers, and many BBP modules these days come with integrated antennas for at least some of the low-power/low-gain RF stuff like BTLE and NFC.

To me this seems like another one of these "experiments" that will only result in overpriced and outdated hardware with poor user experience. I would love to see an N900/950 modernization just for the ability to run actual Linux on the phone to serve as a toolkit. But anyone who thinks that any device they can hack together will give them any level of privacy against organizations like the NSA is living in la-la land...
SS7: Locate. Track. Manipulate [video]
110 points by moe  19 hours ago   21 comments top 5
moe 19 hours ago 0 replies      
Actual talk starts at 00:16:00 into the video.

Tobias Engel demonstrates (amongst other things):

* How to find out the phone numbers of nearby cellphones

* How to track the location of any cellphone worldwide that you only know the phone number of

* How to intercept outgoing calls of nearby cellphones (to record and/or re-route to a different number)

dsl 16 hours ago 2 replies      
I've learned more about how to efficiently seat people in an auditorium than I ever needed to know.

But on a serious note, conference organizers should pay close attention to how CCC does stuff and replicate it. The pre-talk on-screen information is amazing and useful.

wirefloss 14 hours ago 2 replies      
All TDM and Sigtran signaling links of world-wide SS7 network are configured manually peer-to-peer. The signaling traffic including SMS texts travels mostly unencrypted. Hence it's next to impossible to get a real SS7 Pcap log (requires an NDA), let alone access to the SS7 network, unless you work with a network operator.
sounds 16 hours ago 3 replies      
Should be easy to transcode using VLC and post on YouTube, anyone not on Comcast able to do that for the rest of us?
Timmmmmm 16 hours ago 2 replies      
This is pretty shocking. Shame it is technical enough that it will probably not become mainstream news.
Redesigning a Broken Internet: Cory Doctorow [video]
55 points by getdavidhiggins  13 hours ago   32 comments top 5
wuliwong 9 minutes ago 0 replies      
I always cringe when I see a talk or a blog post in tech talking about "fixing" something that is "broken", especially something as complex as the internet. It's just sensationalism. The state of "brokenness" of the internet is totally ill-defined. I would be much more likely to give this talk my time and attention if the topic were more clearly stated in the title.
JBiserkov 13 hours ago 3 replies      
Judging from the first 5 minutes, this Wired article by the same author contains pretty much the same content in textual form - http://www.wired.com/2014/12/government-computer-security
zaroth 8 hours ago 1 reply      
I wish the title were different, since the talk has nothing to do with a broken internet. But the actual topic; computers working for or against their operators, DMCA as a meta-law to be exploited by private enterprise, and DMCA as restraint on public vulnerability disclosure, are all important public discussion.

The key point I heard (and I'll embellish a little further): legislators are passing laws like the DMCA thinking they are merely trifling with entertainment options, but they are mucking with critical infrastructure, the central mediating artifact of our lives, maybe even the platform for our existence. Tread lightly.

Animats 12 hours ago 3 replies      
General purpose computers are so last-cen. Today, we have mobile! The arguments about control today are between the mobile network operators, the mobile device vendors, and governments. Users have little if any control or input to that process. Apple doesn't even allow iOS apps to have programmability.
xenishiet 10 hours ago 7 replies      
Cory Doctorow just needs to STFU and go away.

Let's be clear here: CD is primarily a science fiction writer (feel free to look the info up yourself), not a programmer/engineer (like Richard Stallman), not a researcher (like Michael Geist), not an activist (many many examples we all know) or quite frankly anyone of any relevance. He's the new breed of self-aggrandizing web whore that gets himself shoehorned into "tech" sites or simply via his dumpster blog Boingboing. To stay relevant he takes popular internet news (say gamergate) and takes the most hardlined politically-correct stance on it. A great example of this would be how he recently co-authored a fictional book about a "female gamer". I mean... comon...

For the sake of the internet please ignore this person and support people who make a real difference. (I fully expect to be voted down for this by not brainlessly applauding these hipster heroes)

How Google Cracked House Number Identification in Street View
47 points by nkurz  14 hours ago   20 comments top 7
sanxiyn 12 hours ago 2 replies      
This is a tangent, but:

"That's particularly useful in places where street numbers are otherwise unavailable or places such as Japan and South Korea where streets are rarely numbered in chronological order but in other ways such as the order in which they were constructed, a system that makes many buildings impossibly hard to find, even for locals."

South Korea finished renumbering streets in 2011 and after two and a half years of trial completely switched to the new system in January 2014.

giovannibajo1 6 hours ago 1 reply      
In Florence, Italy, in the historical city center, we have a unique numbering system; each street has two independent series of building numbers, differentiated by color: red numbers are for businesses, black numbers for houses. So for instance a restaurant could be located at number "23r" (r=red), while the standard "23" (black) can be hundreds of meters away on the same street.

I think there is currently no mapping system that handles this madness. Google Maps still does a decent job if you're looking for a specific place, because people have reported the exact gps positions of most businesses through user-reporting, but if you enter an address with a red number, you're unlikely to be correctly directed.

I guess the neural network knows nothing of colors...

nl 6 hours ago 1 reply      
I haven't read the paper yet, but I don't think this is even the biggest CNN inside Google. The NIPS 2012 Hinton/Dean paper talks about a single network trained for image classification for six months on a large number of cores.
cube00 9 hours ago 0 replies      
It would have been good if the article explained the link between this work and house numbers appearing in reCaptcha. Training the network perhaps?
pervycreeper 8 hours ago 2 replies      
Personal experience suggests that they are also using CAPTCHAs for the same purpose. I wonder how that figures in to the project.
softdev12 12 hours ago 1 reply      
The article just skims over this part:

"To start off with, Goodfellow and co place some limits on the task at hand to keep it as simple as possible. For example, they assume that the building number has already been spotted and the image cropped so that the number is at least one-third the width of the resulting frame. They also assume that the number is no more than five digits long, a reasonable assumption in most parts of the world."

This seems like a huge task. Someone has to go through all the thousands of images and first crop them? During that time, it would seem like they could just input the number into a database.

Maybe I'm missing something, but I read the "cracked" part to be a totally automated system that scans all the pictures and pulls the numbers with no human manipulation.

frozenport 9 hours ago 0 replies      
Perhaps the neural net that Google uses to choose research areas decided to learn how to locate human dwellings?
Masscan: Scan the entire Internet in under 5 minutes
131 points by pmoriarty  22 hours ago   26 comments top 11
twolfson 3 hours ago 1 reply      
"It's the program that scanned the Internet in less than twelve parsecs."
ianremsen 18 hours ago 0 replies      
Note: your ISP and third-parties probably won't like this very much.
yzzxy 18 hours ago 0 replies      
There's a great talk from Defcon 22 on using Massscan for security research:


NelsonMinar 13 hours ago 1 reply      
This is a hell of a piece of engineering. Really fun to read the README, a custom TCP/IP stack is genius.
sirwolfgang 19 hours ago 2 replies      
Title should be updated to include that this system scans only via IPv4. Doing such a thing with IPv6 would be a little more surprising. (7.9228163e+28 times more difficult)
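The quoted factor is just the ratio of the two address-space sizes, which can be checked with BigInt arithmetic (an illustrative sketch; the variable names are mine):

```javascript
// IPv6 has 2^128 addresses, IPv4 has 2^32, so a brute scan of IPv6
// is 2^128 / 2^32 = 2^96 times larger — the "7.9228163e+28" figure.
const ipv4 = 2n ** 32n;
const ipv6 = 2n ** 128n;
const ratio = ipv6 / ipv4; // exactly 2^96
console.log(Number(ratio).toExponential(7));
```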
jwcrux 19 hours ago 0 replies      
Here's [1] an example of using Masscan to scan the IPv4 space for shellshock.

[1] http://blog.erratasec.com/2014/09/bash-shellshock-scan-of-in...

pvnick 17 hours ago 1 reply      
How likely is this to be used for anti-piracy efforts? I don't hear much about en masse copyright enforcement these days, but it seems like the ability to quickly scan large IP ranges would allow one to periodically (every couple minutes or so) obtain a list of every single seeded file in the US, at least for the people not using a VPN.
pvaldes 15 hours ago 2 replies      
Sorry if I seem naive but, is this even legal? ...
curiously 14 hours ago 0 replies      
what are ip port scanners commonly used for?
sigmonsays 13 hours ago 1 reply      
great.. now everyone can easily find out if my ssh port is open...
cmdrfred 19 hours ago 0 replies      
The Real-time Web: How to Get Millisecond Updates with REST
148 points by jwatte  23 hours ago   22 comments top 9
vosper 21 hours ago 1 reply      
This is an interesting blog post, but I'm surprised about this part:

"[...] even though we have JavaScript code running in the browser, and it knows about the invalidation of a particular URL, it cannot tell the browser cache that the data at the end of that URL is now updated.

Instead, we keep a local cache inside the web page. This cache maps URL to JSON payload, and our wrapper on top of XMLHttpRequest will first check this cache, and deliver the data if its there. When we receive an invalidation request over IMQ, we mark it stale (although we may still deliver it, for example for offline browsing purposes.)"

I thought that the traditional approach to this would be to add a cache-buster [1] to the query string. Couldn't the Javascript code that knows there's a change update the cache-buster argument and refetch the URL? Then they wouldn't need their own cache implementation.

[1] A cache-buster is typically a random value (or timestamp) included as a query-string argument on the URL. This value has no meaning to the server, but because the URL string has changed the browser can't make any assumptions about what the return value would be, and is therefore forced to make a request to the server.

EDIT: I think I see now - what they refer to as "the cache" is more complex than I had thought; it seems somewhat more analagous to the state in a React app, in that it is triggering UI updates and isn't just a dumb layer between the client and server.
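The page-local cache the post describes can be sketched roughly like this (hypothetical names, not IMVU's actual code; invalidation marks entries stale rather than deleting them, so stale data can still serve offline views):

```javascript
// URL -> JSON cache kept inside the web page, as described in the post.
const cache = new Map();

// Store a fetched payload for a URL.
function put(url, json) {
  cache.set(url, { json, stale: false });
}

// Called when a push message (IMQ in the post) invalidates a URL:
// keep the payload but mark it stale.
function invalidate(url) {
  const entry = cache.get(url);
  if (entry) entry.stale = true;
}

// Return fresh data on a hit; null means the caller should refetch
// over plain REST (or fall back to the stale copy when offline).
function get(url) {
  const entry = cache.get(url);
  return entry && !entry.stale ? entry.json : null;
}
```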

ricardobeat 4 hours ago 0 replies      
There used to be a proposed standard called Rocket, where you could subscribe to an event stream after requesting a resource. The github account has gone dark (apparently bought by CoreOS?) but can be seen here: https://web.archive.org/web/20130824235315/http://rocket.git...?
jkarneges 18 hours ago 1 reply      
Good stuff. The push-invalidation-plus-refetch technique is sometimes referred to as "hinting". Some more info here: http://blog.andyet.com/2014/09/23/send-hints-not-data

I've also been working on a protocol definition for REST updates called LiveResource. Anyone interested in this problem space, please send feedback. :) https://github.com/fanout/liveresource

Kiro 5 hours ago 1 reply      
IMQ is written in Erlang. I also know that a lot of the IMVU backend is built in Haskell. Makes me want to work there.
jonpress 18 hours ago 1 reply      
This makes sense for backwards compatibility reasons since it fits in with the existing REST API. An alternative if starting from scratch would be to design an event-based realtime API that uses WebSocket with long polling fallback.
timeu 17 hours ago 0 replies      
Isn't this approach what service workers are supposed to solve (controlling the cache) ?
rslarner 18 hours ago 0 replies      
Great post Jon. IMVU rocks!
lucio 18 hours ago 3 replies      
is there a more interesting use-case than status updates?
curiously 20 hours ago 2 replies      
Would've loved it if they had open sourced their real-time graph solution.

I still don't quite know when I should use a graph database, but I imagine for social networking type websites it is a must (since a standard RDBMS or NoSQL gets too verbose).

An Introduction to Lock-Free Programming (2012)
25 points by dpeck  14 hours ago   1 comment top
toolslive 3 hours ago 0 replies      
you might want to read this too: http://www.mpi-sws.org/~turon/turon-thesis.pdf
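For a taste of the article's subject: the core primitive of lock-free programming is compare-and-swap. A minimal sketch using JavaScript's Atomics (illustrative, not code from the article or the linked thesis):

```javascript
// Lock-free increment via a compare-and-swap (CAS) retry loop.
// Atomics.compareExchange only writes the new value if the slot still
// holds the expected old value; otherwise another thread got there
// first and we retry — no lock is ever taken.
const buf = new SharedArrayBuffer(4);
const counter = new Int32Array(buf);

function lockFreeIncrement(arr) {
  for (;;) {
    const old = Atomics.load(arr, 0);
    if (Atomics.compareExchange(arr, 0, old, old + 1) === old) {
      return old + 1; // CAS succeeded; return the new value
    }
    // CAS failed: someone else updated the slot, loop and retry.
  }
}
```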
How movies embraced Hinduism
38 points by wslh  14 hours ago   21 comments top 7
virtuabhi 8 hours ago 1 reply      
This is the most ridiculous article you will ever read. The author has jumbled together the personal beliefs of one of the movie crew members, different works of different directors, and complete nonsense. Its title might as well be "How movies have embraced Newton's Laws" (with examples of how action-and-reaction was observed when one guy hit another, and how the actor slipping on ice did not stop).
enupten 7 hours ago 1 reply      
Reminds me of this incident: http://gawker.com/5075336/halloween-overachiever-heidi-klum-... (Note the comments, with accusations about being thin-skinned, while other articles harp on about how "Sexy Virgin Mary" is very offensive as a Halloween costume.)

George Lucas, on the other hand, also produced "Indiana Jones - Temple of Doom", which portrays Hinduism in the most condescending of ways.

Yoga is detached from its roots: http://www.hafsite.org/media/pr/takeyogaback

And other anti-Hindu propaganda in CA: http://en.wikipedia.org/wiki/California_textbook_controversy...

I'm drawn to believe that ideas usurped from Hinduism and Buddhism tend to be appreciated more when portrayed as some new-age quasi-spirituality by a Caucasian.

nnain 6 hours ago 1 reply      
Never mind, people. We are 1.25bn people in India, free to say whatever we wish (hence so many fake godmen too). Such nonsense keeps coming up all the time. No one takes it seriously. I'm kind of surprised, though, that it got published in The Guardian.
giis 1 hour ago 0 replies      
I don't know why such articles come up mixing personal religious belief and fancy movies. If I'm not wrong, the author of this article is from India and Hindu? As someone who follows Hinduism, I don't like to see such articles.
curiousDog 3 hours ago 1 reply      
Haha ridiculous, please tell me this article has been paid for by Hindu nationalists like RSS and their overlord, the PM of India, Mr. Modi.
eva1984 6 hours ago 3 replies      
I heard Star Wars borrowed a $hit ton of ideas from Japanese culture, which leads me to believe it is more like some variant of Buddhism than Hinduism.
jasonisalive 5 hours ago 3 replies      
Hinduism: an inexhaustible, ever-reshapeable resource for Indian nationalism.
       cached 28 December 2014 17:02:04 GMT