hacker news with inline top comments    .. more ..    10 Mar 2016 News
Responsive Pixel Art essenmitsosse.de
1243 points by bpierre  13 hours ago   93 comments top 49
essenmitsosse 5 hours ago 1 reply      
Hello, this is Marcus, the guy who did the resolution-independent illustrations. I just uploaded the current version of the site, so it will no longer freeze when the image gets too small. I also updated the Tantalos image.

I am currently a bit blown away by the reaction to this. I finished this over a year ago but didn't manage to make it public till now. I presented this at a meetup in Berlin yesterday and someone asked for an online version, which I posted to Twitter; then things escalated quickly.

So because a lot of people are asking: I am currently redoing my completely outdated homepage. The new one will include an in-depth explanation of what is actually happening there, what's the idea behind it, and how I want to apply this to actual web design to make resolution-independent work easier and less restrictive for both designers and developers.

Until then, I will try to answer some questions here.

Scalable Greetings,
Marcus

fredley 12 hours ago 4 replies      
This is incredible stuff. Zeus in particular is amazing. It all seems to be done with weighted objects and a clever way of rendering them, but it's indistinguishable from magic from where I'm sitting.
noonat 11 hours ago 0 replies      
This is really cool! If you look at the source, it appears that the images themselves are defined as JS code, almost like a vector image. For example, Zeus: http://essenmitsosse.de/pixel/scripts/zeus.px

The author then has a renderer to turn these into pixel data. It seems to render them down to an actual pixel image on the fly.
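The "image as a function of its dimensions" idea is easy to sketch. Below is a minimal, hypothetical illustration in plain JavaScript; none of these names or rules come from the actual .px format, which is far more sophisticated:

```javascript
// Hypothetical sketch of the "image as code" idea: a responsive pixel
// image is a function from (width, height) to a 2D grid of colors.
// The rule here is defined proportionally, so it survives any resolution.
function renderFlag(width, height) {
  const pixels = [];
  for (let y = 0; y < height; y++) {
    const row = [];
    for (let x = 0; x < width; x++) {
      // The top third is dark, the rest light -- at every size.
      row.push(y < height / 3 ? "#333" : "#eee");
    }
    pixels.push(row);
  }
  return pixels;
}

const small = renderFlag(3, 3); // one dark row out of three
const large = renderFlag(9, 9); // three dark rows out of nine
```

The real images add relationships between shapes (and swap entire compositions at certain aspect ratios), but the core contract is the same: pixels are computed from the dimensions, not stored.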

FilterSweep 12 hours ago 1 reply      
The amazing part about the Tantalos slide was that its responsive implementation actually captured the ethos of the myth itself - no matter how far he reached, the fruit moved ever farther away. Outstanding work.
hakvroot 12 hours ago 1 reply      
Besides moving your mouse around, zooming in and out also works beautifully (might put a bit of a strain on your system though).

Given the sheer amount of work for one piece, with The Three Graeae appearing to be around 1000 lines of code, I'm also quite amazed he managed to produce seven. Brilliant, and Art indeed.

imurray 12 hours ago 1 reply      
Reminds me of the following different project for content aware image resizing or retargeting:



onion2k 12 hours ago 2 replies      
Awesome. I didn't really get it until I looked at the Brother picture, but that is brilliant.
kdamken 12 hours ago 0 replies      
Are you some kind of wizard? This may be the coolest tech demo I've seen all year.
drcode 12 hours ago 0 replies      
This is the kind of stuff I come to HN for! MORE OF THESE KINDS OF POSTS PLEASE!
dahart 7 hours ago 0 replies      
So freaking awesome, this just rules!

My faves are 1- Zeus, and 2- Teiresias

Those two look like they took the most work, and are most technically challenging. The Man-Eagle-Bull-Snake morph especially.

Halfway random tangent, and halfway related, but I'm really super looking forward to wider adoption of SVG 1.2 precisely because it adds absolute unit constraints in addition to the relative constraints, so you can do some of the same kind of stuff in an SVG image, and have authoring tools to support it. Not the pixel art side of it, and nowhere near as crazy as this project, but it will still be really useful.

TheOneTrueKyle 11 hours ago 0 replies      
As others have stated, this is an amazing demo! I've played around in the past with designing responsive web comics with no real luck, but could you imagine creating a web comic using the techniques used in this demo?

The potential!

onetwotree 8 hours ago 0 replies      
I got a good chuckle out of the Sphinx one -- the guy is standing in front of it saying "look, a sphinx!", and then if you make it vertical he's all "oh shit! a sphinx!". At least that's how I saw it.

Pretty awesome!

jeromeparadis 12 hours ago 1 reply      
Would love to read some explanation on how this is made. Very impressive!
njharman 11 hours ago 0 replies      
I'm a jaded, cynical, cranky, old curmudgeon. Still I say that was pretty flippin awesome. Especially Zeus!
Patient0 9 hours ago 2 replies      
Does this work on anyone else's phone? I just get a heavily pixelated virtually unrecognisable picture. (iPhone 6 - same for both Chrome and Safari)
jak1192 2 hours ago 0 replies      
This reminds me of those flip-books you made when you were younger (or of current age, whatever).
supernintendo 5 hours ago 0 replies      
Very well done! Just a small QA note for the creator, it's possible to hit the "next" arrow until you reach a broken page [1].

[1] http://essenmitsosse.de/pixel/?showcase=true&slide=7

konne88 3 hours ago 2 replies      
Why do people find this interesting? Is there more to it than the insight that one can define a function that maps image-dimensions to an image?
thedaemon 6 hours ago 0 replies      
This is really awesome. At first I was going to write a comment about how this completely ruins pixel art by changing the pixels which are hand-placed by the artists. But then after viewing a few images I realized that this is hand-tooled change, not just compression. Bravo.
rhaps0dy 12 hours ago 1 reply      
This is amazing!

I really like that in "Teiresias", if you make the canvas narrow enough, the man jumping (presumably Teiresias) changes position and gets a white beard instead of white long hair. Just to still give the impression that it's an old sage, in small vertical space.

silveira 12 hours ago 1 reply      
That's amazing. What scaling algorithm does it use?
awqrre 5 hours ago 1 reply      
There appears to be a resize bug... for example, with The Sphinx: if you go wide and short, the guy is pointing left with his arms, but if you go tall and skinny, he is pointing up with his arms...
adam12 11 hours ago 1 reply      
It breaks if you position the mouse to the top or the left edge of the canvas.
sbarre 10 hours ago 0 replies      
So is this some kind of specialized constraint solver? I'm looking at the JS source for the various art and it seems to be defining relationships alongside the shapes and styles.
jraedisch 9 hours ago 0 replies      
I kept resizing the window until I realized that there was a simpler way already implemented. Thought that was a bug first. Really nice!
Jordrok 9 hours ago 0 replies      
Holy crap, very cool! At first glance I thought it was just a neat little resizing trick, but then Zeus morphed right before my eyes and I did a double take. The way it responds to your movement makes it really effective in a way that would be extremely hard (impossible?) to reproduce in a static image or even a video.
jianyuan 7 hours ago 0 replies      
I like how you can save the image as png! Pretty awesome stuff!
SerLava 8 hours ago 2 replies      
>That's right, I'm tracking this side

Do you mean you're tracking visits that hit "view page source"? Does that work? I can't find any info about that on Google.

elwell 9 hours ago 1 reply      
Try hitting Cmd - a bunch of times.
tompetry 11 hours ago 0 replies      
By the beard of Zeus! Bravo, loved this.
nom 6 hours ago 0 replies      
That's how image resizing should behave. But I guess this will only happen when AIs take over ^^
hammock 10 hours ago 0 replies      
How do you get to the other images? Or is it broken on mobile (Android Chrome)
ljk 9 hours ago 0 replies      
pretty cool stuff! but is anyone else not able to resize the drawing after the cursor is moved out of the window when either the drawing width or height is at 1 pixel?
valine 12 hours ago 1 reply      
I wonder if this could be extended beyond pixel art by rendering it with webgl.
Kenji 11 hours ago 1 reply      
I like this. A little bug has snuck into the code somewhere: If you move your mouse pointer directly into a corner, the script fails with "Uncaught TypeError: Cannot read property '0' of undefined". I presume it is a division by zero when the image height or width becomes zero.
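A conventional guard for this class of failure is to clamp the computed dimensions to a minimum of 1 before indexing pixel data. This is only a sketch of the fix, not the project's actual code; all names here are invented:

```javascript
// Hypothetical guard against the zero-size case the error suggests:
// clamp the canvas dimensions to at least 1x1 before the renderer
// reads row [0] of the pixel array.
function safeDimensions(mouseX, mouseY, maxWidth, maxHeight) {
  const width = Math.max(1, Math.min(mouseX, maxWidth));
  const height = Math.max(1, Math.min(mouseY, maxHeight));
  return { width, height };
}

// Moving the pointer into the top-left corner yields (0, 0); the clamp
// keeps the image at a minimal but valid size.
safeDimensions(0, 0, 800, 600); // → { width: 1, height: 1 }
```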
spdustin 11 hours ago 1 reply      
Grab the lower right corner of the image frame and drag it.
cousin_it 9 hours ago 0 replies      
Two-parameter morphing, cool :-)
Exuma 11 hours ago 0 replies      
This is absolutely nuts... extraordinary.
Nadya 9 hours ago 0 replies      
This is absolutely bloody amazing. The cleverness with the `Zeus` artwork was as entertaining as it was impressive.

I'd love to hear about the inspiration behind the project.

alienbaby 2 hours ago 0 replies      
A joke right?
uneewk 9 hours ago 0 replies      
Some pretty awesome stuff!
zaf 9 hours ago 0 replies      
That's great work.
mirap 11 hours ago 0 replies      
Zeus is the winner! ;)
mucker 6 hours ago 0 replies      
I agree with the many comments here. This is stunningly good work done with such a simple canvas. I'll be taking it to show my kids (who enjoy Greek mythology) this evening.
RUG3Y 11 hours ago 0 replies      
This is stellar.
daodedickinson 10 hours ago 0 replies      
Now we just need neural networks to do this in the style of Jan van Eyck.
lpbonenfant 11 hours ago 0 replies      
this is incredibly impressive!
lemiffe 12 hours ago 0 replies      
Wow, epic!
0xADADA 10 hours ago 1 reply      
Should've been tagged <NSFW>
How Web Scraping Is Revealing Lobbying and Corruption in Peru scrapinghub.com
245 points by bezzi  7 hours ago   38 comments top 7
carlosp420 5 hours ago 3 replies      
Hi there, I am the author of the blog post. I will be happy to answer any question.
ecthiender 6 hours ago 2 replies      
Very interesting how tools like these can be so helpful for journalists and, more generally, for transparency in government functions.

Probably world changing, when considering that even semi-technical folks can cook up tools to dig into things like this.

I know this tool was built by a developer, but Scrapinghub has a web UI for making scrapers.

xiphias 4 hours ago 0 replies      
Can you draw a covisit graph of people - who visited the building at the same times as somebody else? The strength of the connections could be visitedBoth^2 / ((visitedWithoutTheOther1 + 1) * (visitedWithoutTheOther2 + 1))
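Written out as code, the proposed edge weight looks like this (a sketch; the variable names and the list-of-timestamps representation are mine):

```javascript
// Edge weight for a covisit graph, per the formula above:
// strength = both^2 / ((onlyA + 1) * (onlyB + 1))
// where "both" counts shared visit times and onlyA/onlyB count
// each person's visits without the other.
function covisitStrength(visitsA, visitsB) {
  const both = visitsA.filter((t) => visitsB.includes(t)).length;
  const onlyA = visitsA.length - both;
  const onlyB = visitsB.length - both;
  return (both * both) / ((onlyA + 1) * (onlyB + 1));
}

// Two visitors sharing 2 of their 3 recorded time slots:
covisitStrength(["t1", "t2", "t3"], ["t2", "t3", "t4"]); // → 4 / 4 = 1
```

The +1 terms in the denominator keep the weight finite for people who always arrive together, while still penalizing pairs with many unshared visits.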
alecco 2 hours ago 0 replies      
In other countries, corrupt politicians found out a simple captcha per n items is good enough to defeat analysis.
danso 6 hours ago 3 replies      
FWIW, if you live in the U.S., then you benefit from having such data in great quantity, though I don't think it's sliced-and-diced to near the potential that it has:

Lobbyists have to follow registration procedures, and their official interactions and contributions are posted to an official database that can be downloaded as bulk XML:


Could they lie? Sure, but in the basic analysis that I've done, they generally don't feel the need to...or rather, things that I would have thought lobbyists/causes would hide, they don't. Perhaps the consequences of getting caught (e.g. in an investigation that discovers a coverup) far outweigh the annoyance of filing the proper paperwork...having it recorded in an XML database that few people take the time to parse is probably enough obscurity for most situations.

There's also the White House visitor database, which does have some outright omissions, but still contains valuable information if you know how to filter the columns:


But it's also a case (as it is with most data) where having some political knowledge is almost as important as being good at data-wrangling. For example, it's trivial to discover that Rahm Emanuel had few visitors despite his key role, so you'd have to be able to notice that and then take the extra step to find out his workaround:


And then there are the many bespoke systems and logs you can find if you do a little research. The FDA, for example, has a calendar of FDA officials' contacts with outside people...again, it might not contain everything but it's difficult enough to parse that being able to mine it (and having some domain knowledge) will still yield interesting insights: http://www.fda.gov/NewsEvents/MeetingsConferencesWorkshops/P...

There's also OIRA, which I haven't ever looked at but seems to have the same potential of finding underreported links if you have the patience to parse and text mine it: https://www.whitehouse.gov/omb/oira_0910_meetings/

And of course, there's just the good ol FEC contributions database, which at least shows you individuals (and who they work for): https://github.com/datahoarder/fec_individual_donors

This is not to undermine what's described in the OP...but just to show how lucky you are if you're in the U.S. when it comes to dealing with official records. They don't contain everything perhaps but there's definitely enough (nevermind what you can obtain through FOIA by being the first person to ask for things) out there to explore influence and politics without as many technical hurdles.

jorgecurio 6 hours ago 6 replies      
Really interesting use of data extraction....

For developers and managers out there, do you prefer to build your own in-house scrapers or use Scrapy or tools like Mozenda instead? What about import.io and kimono?

I'm asking because lot of developers seem to be adamant against using web scraping tools they didn't develop themselves. Which seems counter productive because you are going into technical debt for an already solved problem.

So developers, what is the perfect web scraping tool you envision?

And it's always a fine balance between people who want to scrape Linkedin to spam people, others looking to do good with the data they scrape, and website owners who get aggressive and threatening when they realize they are getting scraped.

It seems like web scraping is a really shitty business to be in and nobody really wants to pay for it.

dang 2 hours ago 0 replies      
We've banned this account for repeatedly violating the HN guidelines.

We're happy to unban accounts when people give us reason to believe they will post only civil and substantive comments in the future. You're welcome to email hn@ycombinator.com if that's the case.

Pentagon admits it has deployed military spy drones over the U.S. usatoday.com
390 points by jonbaer  13 hours ago   142 comments top 20
deevus 15 minutes ago 0 replies      
Edward Snowden talks about this in Citizenfour.


I also learned at NSA, we could watch drone videos from our desktops. As I saw that, that really hardened me to action. - In real time? - In real time. Yeah, you... it'll stream a lower quality of the video to your desktop. Typically you'd be watching surveillance drones as opposed to actually, like, you know, murder drones where they're going out there and bomb somebody. But you'll have a drone that's just following somebody's house for hours and hours. And you won't know who it is, because you don't have the context for that. But it's just a page, where it's lists and lists of drone feeds in all these different countries, under all these different code names, and you can just click on which one you want to see.


He doesn't say explicitly that this includes the U.S., but I made that assumption and here it is: proven.

cheath 4 hours ago 1 reply      
I'm pretty sensitive to these things. That said, if you look at the partial list provided (granted this was potentially cherry picked), we're looking primarily at disaster awareness stuff. Flooding, wild fires, and search & rescue. Not spying.

Military resources, such as The National Guard, get called up for disaster relief all of the time. I think I'd rather drones helping people in these scenarios than what they're primarily used for.

spdustin 8 hours ago 7 replies      
It's not just drones. Law Enforcement often flies surveillance flights, circling over locations of interest. These planes often have thermal/visual imaging video cameras and, according to some stories, may also be carrying other tracking or signal interception hardware (think Stingray). HN user jjwiseman [0] scooped most of the press about this.

You can locate such "interesting" flights right now, using your browser. Just open up ADSB Exchange Virtual Radar [1], which doesn't filter out flights with certain squawk codes like other online virtual radar sites do. If you do, select "Menu" from the map (with the gear icon), then "Options". Select the "Filter" tab, select to "Enable filters", select "Interesting" from the dropdown listbox, and select "Add Filter". Now you can zoom out over the country, and see all the "interesting" flights using the table on the left. Note any flights with the "LE" or "FBI" or "DHS" user tag.

Right now, as I write this, an FBI-owned aircraft is circling over the Norwood/Bronx area of NYC [2], tail number N912EX, registered to OBR Leasing (one of the "shells" that the US Gov't uses for registering its law enforcement aircraft), as mentioned in an AP story last summer [3]

[0]: https://news.ycombinator.com/user?id=jjwiseman

[1]: http://www.adsbexchange.com, select "Currently Tracking [number] Aircraft on upper-right"

[2]: http://i.imgur.com/EUYqx98.png

[3]: http://bigstory.ap.org/article/4b3f220e33b64123a3909c60845da...

Edit: Another one, flying around NW Los Angeles, right now: http://i.imgur.com/PwRpqRe.png

Edit2: Any aircraft squawking transponder beacon codes between 4401-4433 are engaged in law enforcement operations. More on the various squawk codes reserved by US Gov't operations can be found here (pdf link): http://www.faa.gov/documentLibrary/media/Order/FINAL_Order_7...
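Given that 4401-4433 range, a feed filter could flag law-enforcement beacon codes with a simple check. This is an illustrative sketch only (the 4401-4433 range comes from the comment above; the octal-digits-only rule is standard for transponder codes):

```javascript
// Flag transponder beacon codes in the law-enforcement block 4401-4433.
// Squawk codes are four octal digits (0-7 only), so after validating
// the digits, a lexical range comparison on the strings is sufficient.
function isLawEnforcementSquawk(code) {
  if (!/^[0-7]{4}$/.test(code)) return false; // not a valid squawk code
  return code >= "4401" && code <= "4433";
}

isLawEnforcementSquawk("4412"); // → true
isLawEnforcementSquawk("1200"); // → false (standard VFR code)
```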

protomyth 12 hours ago 3 replies      
Why, yes, they used one to find a cattle rustler in ND http://www.forbes.com/sites/michaelpeck/2014/01/27/predator-...

I still believe this should have been illegal, as they had no warrant and it violates the Posse Comitatus Act.

jallmann 10 hours ago 7 replies      
> an unnamed mayor asked the Marine Corps to use a drone to find potholes in the mayor's city

Let's play a guessing game! San Diego? We've got Miramar and Camp Pendleton here, along with crumbling infrastructure and terrible potholes. Although the city certainly doesn't need help finding potholes around here...

Using drones for this kind of thing actually makes a lot of sense, although not a $20 million militarized Predator.

beau26 12 hours ago 1 reply      
It's shameful that

(a) it took a freedom of information request to make this information public.

(b) the Pentagon did its own internal report and found that there was no wrongdoing.

(c) that nobody in the government is going to hold these clowns responsible or create any sort of legitimate process for determining whether these flights were legal or not.

mmaunder 8 hours ago 1 reply      
It will take a few years, but drone use will trickle down from the military into federal and local enforcement branches.


I'm curious about the flight plan approval and filing process they go through with the FAA.

Also wondering if the flights show up on the 5 minute delayed ASDI API:


And if the drones carry ADS-B transceivers and if they show up on other transceivers during flight. (Which would make them visible/trackable to anyone ground or air based that is listening)


cgriswald 12 hours ago 4 replies      
> "Sometimes, new technology changes so rapidly that existing law no longer fit what people think are appropriate," Stanley said.

Sometimes, maybe. I'd argue rarely. I don't see much difference between an unmanned drone and an unmanned satellite or a manned helicopter in terms of applicable law.

Believing that a new law is needed because computers/drones/robots/AI/whatever now exist can lead to bad laws, or laws that are out-of-balance in terms of punishment. (i.e., commit a crime - 5 years. commit the same crime WITH A COMPUTER - 10 years)

> "It's important to remember that the American people do find this to be a very, very sensitive topic."

I think the media finds this to be an eyeball-grabbing topic, but AFAICT, the American people do not care much about it.

jngreenlee 11 hours ago 0 replies      
There's a lot of talk about the Posse Comitatus Act[0]. The real distinction is in "intent of the mission". A mission that is:

A) Conducted by the United States Army or the United States Air Force, and

B) Conducted to enforce domestic policies within the United States

Would be in violation of the Posse Comitatus Act. However, there is disagreement over whether this language may apply to troops used in an advisory, support, disaster response, or other homeland defense role, as opposed to domestic law enforcement.[1]



jameslk 11 hours ago 1 reply      
It seems hypocritical to assume this is any worse than having these drones flying over other countries, sometimes without their consent.
tokenadult 6 hours ago 0 replies      
From the article:

"The Pentagon has publicly posted at least a partial list of the drone missions that have flown in non-military airspace over the United States and explains the use of the aircraft. The site lists nine missions flown between 2011 and 2016, largely to assist with search and rescue, floods, fires or National Guard exercises.

"A senior policy analyst for the ACLU, Jay Stanley, said it is good news no legal violations were found, yet the technology is so advanced that it's possible laws may require revision."

This sounds a lot less dramatic than the article headline, but it's good that this is being reported and discussed publicly.

cbanek 10 hours ago 1 reply      
A couple of weeks ago, I was taking in some scenery around Creech AFB, North of Las Vegas. While there, I saw a Predator (although it might have been a MQ-9 Reaper) being launched from the airstrip.

It looked like a giant model aircraft getting launched when the wind hit it. Pretty cool at the time, although after reading this, I hope it was just a training mission over some non-existent Nevada AFB...

EasyTiger_ 11 hours ago 3 replies      
Next come drone attacks on US citizens? And who is going to stop them, now that they can do anything they want in the name of terrorism.
nerdcity 9 hours ago 0 replies      
>any use of military drones for civil authorities had to be approved by the Secretary of Defense

Gee, what oversight. I'm sure they'll be denying approvals left and right.

Zhenya 10 hours ago 2 replies      
I wonder who mayor genius is:

 One case in which an unnamed mayor asked the Marine Corps to use a drone to find potholes in the mayor's city.

m23khan 9 hours ago 0 replies      
Pakistan says Hi!
madaxe_again 9 hours ago 0 replies      
Legal != right.
l3m0ndr0p 10 hours ago 1 reply      
Those people that are responsible for this activity should be brought before a court and tried for treason.
pinaceae 9 hours ago 1 reply      
Now you know how it feels.

/signed by the rest of the world.

Valley VCs Sit on Cash, Forcing Startups to Dial Back Ambition bloomberg.com
93 points by digisth  5 hours ago   59 comments top 10
tdaltonc 5 minutes ago 0 replies      
In other news, "Suits Make a Comeback!"[0].

Investors can't just sit on money. They have to get returns, and that means that they have to put capital to work.

[0] http://www.paulgraham.com/submarine.html

delecti 5 hours ago 5 replies      
It's really weird how this article tries to frame the situation. It's almost like the startups feel entitled to the funding.

The point of funding should really be to enable faster growth than they might otherwise have been able to achieve, but if a business can't at least survive without huge influxes of investments then is it really a business that they should be investing in in the first place?

pritianka 4 hours ago 4 replies      
I've been in tech only 6 years and I am already bored of these cycles of VCs becoming frenetically exuberant followed by cautious times. Their advice to startups changes depending on what time it is. It's all so predictable yet people are surprised every time. Any entrepreneur building a business factors these in and approaches fund raising based on that knowledge. I don't even know the point of these articles any more.
gooserock 4 hours ago 3 replies      
> Valley VCs Sit on Cash, Forcing Startups to Dial Back Bullshit


estro 39 minutes ago 0 replies      
It seems like these VC cycles are akin to a natural selection process for startups. Those with legitimate market fit and pricing schemes will have the highest fitness and thus survive; new startups will (hopefully, although historically not so much) attempt to copy a similarly sustainable architecture. So in essence these cycles are beneficial to startup market health.
bitwize 3 hours ago 0 replies      
Wait, what? VCs' valuation of their money exceeds their faith in your "unicorn" startup idea? Ohhhhhh nooooooooooo
rl3 5 hours ago 1 reply      
In theory this won't affect the decision making process of top-tier VCs. A good investment is a good investment regardless of the prevailing funding climate.

In practice, I'm guessing if the LPs get cold feet, then VCs will be forced to triage their funding decisions accordingly. How much this matters given the sheer size of some funds, I'm not sure.

cm3 3 hours ago 1 reply      
Or they could diversify and invest in more projects with smaller sums, couldn't they? Would probably require more people to manage the increase of investments.
hoodoof 4 hours ago 1 reply      
Investors market. Tighter conditions, preferences, ratchets.
spullara 4 hours ago 0 replies      
I think it is interesting that this article is critical of valuations changing over such large time spans when the public stock market often marks up and down stocks by a significant amount on a daily basis.
How We Build Code at Netflix netflix.com
481 points by hepha1979  13 hours ago   113 comments top 13
mkobit 10 hours ago 11 replies      
I'm interested in knowing more about the "25 Jenkins masters" that they have, and how much they have modified/built for Jenkins to make it work for them.

We are currently in a state of "big ball of plugins and configuration". A bunch of plugins have been installed, and lots of manual configuration has been put into jobs so that everybody has what they need to build their software. It has led to Jenkins being a "do everything" workflow system. The easy path that Jenkins provides, to me, seems like the wrong one - it makes it easy to just stuff everything in there because it "can" do it. This seems to lead to tons of copy/paste, drift, all types of different work being represented, and it is starting to become unmanageable.

Have others seen this happen when using Jenkins? How have you dealt with it?

gjkood 6 hours ago 2 replies      
Major outage being reported worldwide.


Anything interesting deployed in the last hour?

Something in the CI/CD tool chain, Spinnaker, failed for it to move all the way to Live without being caught.

vlucas 11 hours ago 0 replies      
For those wondering how this applies to Node.js users at Netflix like myself, it's in there towards the bottom of the article:

> "As Netflix grows and evolves, there is an increasing demand for our build and deploy toolset to provide first-class support for non-JVM languages, like JavaScript/Node.js, Python, Ruby and Go. Our current recommendation for non-JVM applications is to use the Nebula ospackage plugin to produce a Debian package for baking, leaving the build and test pieces to the engineers and the platforms preferred tooling. While this solves the needs of teams today, we are expanding our tools to be language agnostic."

Gratsby 8 hours ago 2 replies      
> The Netflix culture of freedom and responsibility empowers engineers to craft solutions using whatever tools they feel are best suited to the task.

I absolutely love that. I'm a huge fan of what Hastings and company have done over there in terms of culture and making Netflix a unique and desirable place to work.

I think it's time for another round of "find a way to make Netflix hire me."

moondev 13 hours ago 2 replies      
Spinnaker is an amazing tool. Really makes it easy to confidently deploy applications via immutable infrastructure.
mattiemass 13 hours ago 2 replies      
Very cool article. Amazing how much tooling Netflix has built themselves.
Scarbutt 12 hours ago 4 replies      
What are the reasons for Netflix choosing nodejs for their front-end server and not java like in their back-end?
markbnj 10 hours ago 2 replies      
I'm a Netflix fan, as a consumer and an engineer, and this blog post just reinforces my fanboi status. Amidst the descriptions of deployment tools and pipelines one thing stood out for me: the fact that AMI bake times are now a large factor, and that "installing packages" and the "snapshotting process" were a big piece of this. Containers are definitely the answer to this problem. You can deploy base images with the OS and common dependencies, and have the code changes be a thin final layer. Of course with such a sophisticated pipeline based on AMI deployment this change would not be trivial for Netflix, but the bottom line is they have described the primary container use case perfectly, imo.
neduma 10 hours ago 2 replies      
How do they 'externalize config' with respect to http://12factor.net/config?
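For reference, the 12-factor "config" idea the question points at is simply reading settings from the environment, so the same build artifact runs unchanged in every deploy. A minimal sketch in Node.js (variable names here are illustrative, not Netflix's actual ones; how Netflix specifically externalizes config isn't covered in the article):

```javascript
// 12-factor style config: settings come from the environment, not from
// files baked into the image. The build stays identical across deploys;
// only the environment differs.
function loadConfig(env) {
  return {
    dbUrl: env.DATABASE_URL || "postgres://localhost/dev", // dev fallback
    port: Number(env.PORT || 3000),
  };
}

// The same artifact, different deploys:
loadConfig({});                                                  // dev defaults
loadConfig({ PORT: "8080", DATABASE_URL: "postgres://prod/db" }); // production
```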
oconnore 11 hours ago 1 reply      
x0rg 9 hours ago 1 reply      
Isn't Netflix using Mesos (see http://techblog.netflix.com/2015/08/fenzo-oss-scheduler-for-... )? I don't understand what role it plays here.
sayrer 10 hours ago 2 replies      
Seems like a nice system, but would be improved by building with Bazel or Buck instead of Gradle.
dang 10 hours ago 0 replies      
Please stop posting these.
The New Mind Control The internet has spawned subtle forms of influence aeon.co
159 points by mark_l_watson  8 hours ago   119 comments top 29
md224 7 hours ago 14 replies      
I have made this case before, but I will make it again.

The Internet is the largest information system in the world, and Google is the primary portal into that information system. Google's "organic" results are accompanied by AdWords results, which are based on a mixture of bid price and relevance. These ads are marked with a small "Ad" label that many people miss, and even those who know they're ads can't really "unsee" those results.

So, searching the world's largest information system provides results which have been biased by money. How does anyone consider this ethical? Why are we letting money influence the salience of information?

What if your local library (you know, those old things) had a card catalog with "sponsored" results? If this already exists, then maybe we're already lost. But it seems to me that as a basic rule of information ethics, the salience of information in a given information system should not be biased by monetary influence. Full stop, the end, no exceptions. If anyone has a counterargument, I would honestly love to hear it, because this has nagged at me for a long time. I simply can't understand how AdWords is ethical.

justsaysmthng 5 hours ago 3 replies      
I remember the '90s, when the Internet as we know it was just a baby... The enthusiasm we all shared for it. Of a better future. Of true democracy in the world. Of free people, free minds. The Internet will be the cure for all the social ills that humanity has experienced in the past. People will trade and talk with each other and that's how we can have peace on Earth!

We will be able to discuss, collaborate, create. We would be able to watch any film, listen to any song.

Some people called us "geeks", we liked to call ourselves "hackers".

We are not just hacking code, we're hacking a new world.


A quarter of a century later and most of those things are now reality. But somehow these great things have brought with them some hidden things. Things which we ignored or brushed off easily back in the day..

Like the fact that the Internet is now populated by the same demographic as the real world, not just hackers and dreamers. Now everyone is online.

We thought it would free us from oppression, but it is becoming the ultimate tool for oppression.

We thought it would give us true democracy, but it is becoming the ultimate system for suppressing "foreign" thought and generating groupthink.

We thought it would serve our needs, but it is becoming the thing that tells us what to need. We thought it would satisfy our tastes, but our tastes are now being programmed into us by it.

Of course we're still high on all the positive aspects, and it's not in our nature to be scared of things, but that will soon wear off... And when we wake up, what will we find there?

Either way, it is unstoppable and nobody can turn it off. So all we can do is wait and see what it ultimately turns into.

What will it be like 25 years from now? Will we still be able to discuss this freely?

jamesblonde 7 hours ago 1 reply      
As an adjunct to this great read, you can find out more about this field through the historical documentary "The Century of the Self" by Adam Curtis and the BBC. It is the 'red pill' for understanding consumerism. https://vimeo.com/10245146
erikpukinskis 5 hours ago 1 reply      
Google could send people to places that are contrary to the user's interest, but that would essentially mean deliberately decreasing the quality of one of their products, so I'm not that worried about it. If they do, it's a (big) market opportunity for someone else.

I would go even farther: I'm not particularly worried about individual interests at all, on any subject. The Internet is very good at exposing them.

I am much more concerned with bad classes of actors than bad actors. We see many ways in which competition breaks down because entire classes of people benefit from working in synchrony. The classic example is politicians: crooked elections mean longer terms which benefits basically all of them.

The other classic example is the capital class. If everyone in the capital class plays by the rules of property, then they can exploit the labor class. Once you're in the capital class there are few reasons to compete with private property. Social pressure mostly neuters whatever capital class activists might try to keep working.

It's these class barriers that we should be worried about. But new weapons (like search engines) and new villains (like Islam) make much better news stories.

Dowwie 4 hours ago 0 replies      
"We now have evidence suggesting that on virtually all issues where people are initially undecided, search rankings are impacting almost every decision that people make."

I dug into his CV and found the following related works:

- recent publications: http://aibrt.org/index.php/internet-studies

- The Search Engine Manipulation Effect (SEME) and its possible impact on the outcomes of elections [http://aibrt.org/downloads/EPSTEIN_&_ROBERTSON_2015-The_Sear...]

A talk that he gave at Stanford about SEME: https://www.youtube.com/watch?v=TSN6LE06J54&feature=youtu.be

- Democracy at Risk: Manipulating Search Rankings Can Shift Voters' Preferences Substantially Without Their Awareness: [http://aibrt.org/downloads/EPSTEIN_and_Robertson_2013-Democr...]

CV: http://drrobertepstein.com/pdf/vita.pdf?lbisphpreq=1

manachar 7 hours ago 4 replies      
This read like anti-Google FUD.

The basic argument is that search engine rank determines the trustworthiness of a source. This influences people's opinions on politics, what they buy, what they think, etc.

This is absolutely true, and the core of their research (it seems).

But then it goes into FUD territory when talking about Google backing Hillary. Hillary and Trump have received the lion's share of attention in media, social media, and such. Google searches SHOULD show them prominently.

Worse, the article basically finishes up with a "be afraid, be very afraid" approach that rankles me. "The new hidden persuaders are bigger, bolder and badder than anything Vance Packard ever envisioned. If we choose to ignore this, we do so at our peril."

No solutions or deeper analysis. No discussions on how a search engine should rank relevancy to search terms.

I personally have no doubt that mass-media, marketing, and the internet are shapers of opinions. Bias in the media, search engines, and such is a complex topic. Not something that should boil down to "Google could make it so Hillary wins" therefore you should be afraid.

Gatsky 3 hours ago 0 replies      
Here is the actual study: http://www.pnas.org/content/112/33/E4512.full.pdf

I can't see how this is valid at all. We know that polls give biased results unless you are very careful with the sampling. Here people are self-selecting for a poll (e.g. via Mechanical Turk). Then you apply a highly contrived scenario in which they are googling about a candidate. Then you ask them a bunch of questions, immediately, and proceed to draw wide-ranging conclusions designed to increase your self-importance as much as possible. I mean, seriously, it's worse than useless.

This article is also written as if these findings are earth shattering. After conducting a small, biased, invalid study (Asking people in San Diego about an Australian election? How does that generalize to anything?) and finding a large effect Epstein says 'We did not immediately uncork the Champagne bottle'. Is that how psychology research is conducted? Researchers toasting large implausible effects in small biased samples that have no external validity?


ashurbanipal 6 hours ago 2 replies      
Hold on, so we call it "Mind Control" when Google shifts our preferences toward one of two pre-selected choices? What do we call the state of the world that leaves us with only 2 pre-selected choices who happen to agree on 90%+ of all policies?
Mindless2112 6 hours ago 1 reply      
Google doesn't need to use search results to manipulate voting behavior -- they have Google Now.

Google Now currently displays cards to remind people to vote on voting day. Maybe it just happens to be more likely to show up for people that have been profiled as likely to vote for Google's favored candidate.

nunyabuizness 1 hour ago 0 replies      
I once read an article about surveillance with a title along the lines of "A Tale of Two Cities."

In it, the author explains that there are two types of surveillance cities that will emerge in the future: one where every park bench is rigged with a mic, every street corner has a camera aimed at it, and where all the data collected is funneled to law enforcement agencies; if you were mugged on some street corner, they'd be able to react to the crime swiftly and with high accuracy.

The other city is exactly the same, but all the data is made available to all citizens through an open API; so if you wanted to meet with someone on some street corner, you could decide for yourself if it was safe enough to visit, likely preventing the crime from happening at all.

Does anyone know what article I'm talking about?

labster 5 hours ago 0 replies      
It's looking like our only hope left is the recently declassified WMF search engine. I get that people didn't like Lila doing all of that grant in secret, but I find myself not really opposed to Wikimedia taking on Google. In the long term, someone is going to have to do it.
nsxwolf 5 hours ago 3 replies      
Something like this recently disturbed me. Mitt Romney published a tweet storm a few days ago making his case against Donald Trump. They all showed up in my feed - I don't follow Romney, and the tweets weren't marked as sponsored, but somehow Twitter decided I should be seeing them anyway.

I'd hate to think Twitter decided it was for the public good that everyone read what Romney had to say about Trump.

ikeboy 2 hours ago 0 replies      
>Keep in mind that we had had only one shot at our participants. What would be the impact of favouring one candidate in searches people are conducting over a period of weeks or months before an election? It would almost certainly be much larger than what we were seeing in our experiments.

Reminds me of http://slatestarcodex.com/2014/09/24/streetlight-psychology/

>And in 2015, a team of researchers from the University of Maryland and elsewhere showed that Googles search results routinely favoured Democratic candidates. Are Googles search rankings really biased?

A greater portion of liberals than conservatives use social media (source: http://www.pewinternet.org/2012/03/12/main-findings-10/). Maybe they organically generate more links?

guelo 6 hours ago 3 replies      
This is a really interesting study but it's hard for me to believe in the conspiracy theory that the hundreds of engineers that work on Google Search would be OK implementing a complicated vote manipulation algorithm and keeping it secret. But it's possible, criminal conspiracies involving many people seem to happen regularly in the financial industry.
Sir_Cmpwn 2 hours ago 0 replies      
What I find tragically interesting is that the maintainers of this website (aeon.co) and likely most of the people reading this comment are contributing to the massive dominance the mentioned companies have over the flow of personal information about people online. If I didn't block trackers, it looks like Facebook and Google would both know I read this article, along with Twitter and New Relic.
pcmaffey 2 hours ago 0 replies      
>Power on this scale and with this level of invisibility is unprecedented in human history.

I would argue that the influence of media has always been this powerful. And media has always been biased.

Another angle to look at would be to apply the work of Stanley Milgram re: obedience to authority figures. Our ability to think for ourselves has some evolving to do...

whatTheFuckEvr 3 hours ago 0 replies      
Oh god. This article is so fucking disappointing.

What a quaint boogeyman this "Search Engine Manipulation Effect" is. It even has its own obscure little acronym, SEME, to appear more relevant.

I learned about the subtle effects of advertising by the time I was in fourth grade, and certainly understood how to ignore them by middle school.

Back when special holographic foil comic book covers and trading cards were new, I had already figured out that all of these "collectibles" were mass-produced, and would never wind up as valuable as, say, Action Comics #1, despite so many claims otherwise. This was something you could kind of figure out on your own. If you were easily amused by shiny objects, though, you might not arrive at the same conclusion.

Meanwhile anyone could figure out that the influence of single frame inserts in movies was as potent and realistic as the subliminal messaging in John Carpenter's Sci-fi movie, THEY LIVE.

So too, with Search Engines.

Figure that if a fourth grader can see through the shenanigans of opinion and belief influence in advertising, and unravel the bullshit of religion before high school ends, then this other, newer form of bullshit is similarly debunked by comparable intellects. If you're so stupid that you buy into bullshit without multiple channels of factual verification, you're your own worst enemy.

Okay, okay, maybe this is good reading material for an elementary school classroom assignment, focused on current events. Sure, why not?

I was hoping this would be about technological manifestations of psychic telepathy through malicious use of functional MRI systems.


knaik94 3 hours ago 0 replies      
As much as I would love to imagine some powerful people intentionally influencing search results on a big scale, Google and Facebook and Twitter have no real way to make money from it. I would argue that it would only drive people away. I am sure that it happens to a certain degree (look at the marketing of Bernie Sanders on reddit), but a much bigger influence is your social circle and your source of new information.

Social media mirrors people's attitudes. If you see new tweets about an issue you don't particularly care for, there's a good chance that people you follow do care. It's basic psychology that you befriend people who share your views and interests. If all of your friends tweet or like or show interest in something, then Twitter will assume you do too. Twitter makes money from user engagement, so it's logical to show you things that your friends agree with, because chances are you will too.

I think a real issue is the lack of a source of unbiased information. Relevant information and information you agree with are very different things.
hellbanner 6 hours ago 0 replies      
So on this note, I just want to point out that every time the horrors of new technologies are talked about, something else isn't talked about.

I've seen a number of newer HN accounts flooding the site with articles... presumably for eyeballs + ad revenue, but hey, maybe they just want another link to lose attention...

scottlocklin 3 hours ago 0 replies      
Megacorporations are bad for the internet for certain, but I don't feel very mind-controlled by Google, as I almost always use DuckDuckGo or Yandex. They produce fairly similar results, and I have the satisfaction of not shoveling the tiniest bit of money at a company I have likened to fat Vegas-era Elvis.

I also doubt the results of their research. Nobody is going to vote for Donald Trump because he happens to appear first in a google search; that's just retarded. I think the fact that outsider candidates are locked out of legacy media megaphones and party power structures seems more harmful to democracy, and this has been accepted as "just how it is" for decades.

r0m4n0 6 hours ago 0 replies      
The internet is just a new medium to persuade opinions just as the many before it. Most people don't even know who represents them apart from what's talked about in the news. This study required people to actually browse through this fake search engine. I'm not convinced many people do any research whatsoever (beyond the top of the ballot).

Does this "search engine manipulation effect" have an impact on top of the ballot votes? We still don't know. Does it have an impact on everyone else on your ballot? Nope.

Disclosure... I am the founder of a company that builds a tool for organizations to blatantly tell people who to vote for...

_han 7 hours ago 0 replies      
This topic is also addressed in the last season of House of Cards.
hotcool 6 hours ago 1 reply      
I designed supraliminal posters[1] to counter the covert forms of persuasion like Low Attention Processing marketing[2]. I definitely find them helpful, especially for meditation.

[1] http://zenpusher.com

[2] http://www.neurosciencemarketing.com/blog/articles/low-atten...

daveloyall 6 hours ago 0 replies      
Related: https://googleblog.blogspot.com/2009/12/personalized-search-... "Personalized Search for everyone " (2009)
NoMoreNicksLeft 6 hours ago 0 replies      
Mind control in the same way encyclopedias used to favor subjects that start with the letter A?

I swear, sometimes I think the world is inhabited by p-zombies, who don't actually think things through, but just mindlessly recombine previously consumed memes into (slightly) novel variants.

Was this written by a second grader?

andrewclunn 2 hours ago 0 replies      

Use DuckDuckGo.

EGreg 5 hours ago 0 replies      
This article is speaking about something that has existed as long as mass media has. Whether it's Google or newspapers or TV, the companies running the sources we turn to have control over what information we are exposed to, and can influence our views. Our grandfathers read the newspaper; our parents watched TV. What, Google is a monopoly? OK, so that's the big issue.

On the other hand, our susceptibility to having our political system be disproportionately affected by a company or two with a top-down chain of command is a reflection that our system of representative democracy has weak links and can be easily subverted.

I wrote about the solution to this a while back: replace voting with polling! Have people cast their voice for POLICIES not REPRESENTATIVES. It is much more costly to fool all the people all the time, than to fool them at election time, and then go on to lobby the representatives they chose.

Voting depends on turnout, which skews the results and is susceptible to Sybil attacks (remember Facebook's vote about the news feed that got 3% turnout?).

Polling doesn't. It can be refined using better and better statistical techniques. We can gradually replace costly and stupid elections where candidates talk about their penis with polling of the population on issues like gun control, etc. Replace the bickering lawmakers and filibusters with polling and thresholds.
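The "better and better statistical techniques" point has a concrete basis: under simple random sampling, a proportion's margin of error shrinks like 1/sqrt(n), independent of population size. A minimal sketch (the function name and numbers are my own illustration, not from the comment):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% confidence half-width for a simple random sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Even a modest random sample pins down a policy question reasonably well:
# 1,000 respondents at p = 0.5 give roughly a +/-3 percentage point margin.
moe = margin_of_error(0.5, 1000)
assert 0.030 < moe < 0.032
```

The caveat, as the study critiques elsewhere in the thread note, is that this only holds when the sample is actually random rather than self-selected.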


Microsoft will release a custom Debian Linux theregister.co.uk
178 points by l1n  9 hours ago   72 comments top 15
caf 4 hours ago 1 reply      
In tangentially-related news, there was a lot of talk at NetDev about switchdev, a new Linux driver model for hardware-offload switching hardware.

It allows the kernel's Layer-2 and Layer-3 switching/routing configuration to be reflected down into the switch offload hardware, and the switch's ARP and MAC table data to be reflected back up to the kernel stack.

The overall idea being you can continue to use the same userspace tools to configure the routing/switching, and it all just magically goes faster if you have supported switching hardware.


Someone1234 9 hours ago 2 replies      
The article says "Microsoft will release a custom Debian Linux," but the linked Github repository says:

> Q. Is SONiC a Linux distribution?

> A. No, SONiC is a collection of networking software components required to have a fully functional L3 device that can be agnostic of any particular Linux distribution. Today SONiC runs on Debian

harry8 4 hours ago 1 reply      
I imagine anything Microsoft releases that could possibly contain GPL software, such as the Linux kernel, will face the most aggressive search for license violations of any software ever. Memories are long, and that distrust is not going away any time soon.
mankash666 8 hours ago 5 replies      
Why Linux? The networking stack on BSD is superior, and the OS places no copyleft restrictions!

I'm starting to believe that developers choose the OS/tools they are used to (Linux in this case) versus the ones best suited for the job (BSD).

cpeterso 7 hours ago 1 reply      
They should call it XENIX. :)
qwertyuiop924 4 hours ago 1 reply      
Ah, how wonderful it will be to live in a world without embrace and extend.

Wait. systemd, kdbus, GNOME and systemd-udevd. Shit.

We have met the enemy, and befriended it. Now we are the enemy.

MayMuncher 1 hour ago 1 reply      
Anybody have any links to switches/routers that support SAI? I couldn't find any.
merb 8 hours ago 1 reply      
Actually, what Microsoft is doing could be great. However, I don't understand why they even use Jenkins for this project (https://github.com/Azure/sonic-build-tools). I mean, I love Jenkins, but wouldn't it be better if they used their own build tooling? Something like a tfs-linux-worker; I know that doesn't exist, but if they had built something, they could have done something good there. Using Jenkins feels like "we can't yet do that with our own stuff".
ajarmst 3 hours ago 0 replies      
On the plus side, that will be the most carefully reviewed, evaluated and checksummed distro in Linux history.
corncobpipe 8 hours ago 0 replies      
I'm sure the lawyers at SonicWall will love this
criddell 7 hours ago 2 replies      
I always thought that Microsoft was blocked from getting into Linux by the terms of their sale of Xenix.
chris_wot 6 hours ago 1 reply      
Satya Nadella is a breath of fresh air. It's amazing the difference in management styles from the Ballmer days.

When Microsoft put Nadella in charge, they made a great decision. And I honestly don't say that very often about top level management.

duncan_bayne 7 hours ago 2 replies      
Life imitates (comedic) art ...


thescribe 7 hours ago 1 reply      
I thought they already did, and it was called RHEL. I guess that's not Debian.
chenster 2 hours ago 1 reply      
It's probably also the OS that will run the SQL Server on Linux announced this week - https://blogs.microsoft.com/blog/2016/03/07/announcing-sql-s....
Is this c/10 spaceship known? conwaylife.com
247 points by morninj  9 hours ago   56 comments top 9
pervycreeper 8 hours ago 2 replies      
This writeup (found a few pages into the thread) explains things in a little bit more detail for a newcomer. https://niginsblog.wordpress.com/2016/03/07/new-spaceship-sp...
heavenlyhash 7 hours ago 8 replies      
That's a beautifully concise pattern-sharing notation being used there.

Now if only we had descriptions of chemistry that were this terse. Imagine if this kind of problem solving, collaboration, simulation, and instant verification were the norm for synthetic chem. One of the comments -- "[Let's] use gencols to rub the ship against gliders and *WSSs to see whether there is a useful collision to maybe build a puffer" -- just blew me away. If this were chemistry, that commentator would have been suggesting automatic nanomachine factory discovery.

(InChI appears to be close. But vast amounts of data are locked up in obtuse formats: either Assigned-Names-And-Numbers-style formats, which are useless for indexing and similarity searches, or formats that embed non-relative coordinates in 3D space, etc., in such a way that computing a deterministic ID for sharing is practically a nonstarter.)
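The concise notation being praised is presumably the run-length-encoded (RLE) pattern format used on the conwaylife.com forums, where `b` is a dead cell, `o` a live cell, `$` ends a row, and `!` ends the pattern. A minimal decoder sketch (the function name and the glider example are mine, and this ignores the optional `x = …, y = …` header line):

```python
import re

def decode_rle(rle):
    """Decode a Life RLE pattern body into a set of (row, col) live cells."""
    cells, row, col = set(), 0, 0
    for count, tag in re.findall(r"(\d*)([bo$!])", rle):
        n = int(count) if count else 1   # a bare tag means a run of 1
        if tag == "b":                   # dead cells: just advance
            col += n
        elif tag == "o":                 # live cells
            cells.update((row, col + i) for i in range(n))
            col += n
        elif tag == "$":                 # end of row(s)
            row += n
            col = 0
        else:                            # "!" terminates the pattern
            break
    return cells

# The glider, "bob$2bo$3o!", decodes to its five live cells:
assert decode_rle("bob$2bo$3o!") == {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
```

The format is deterministic and trivially diffable, which is a big part of why forum collaboration like this works so smoothly.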

ticklemyelmo 7 hours ago 3 replies      
Hint: Click "Show in viewer" in the first message. Zoom out a bit. Press play.
xamuel 5 hours ago 3 replies      
Very nice!

You might be interested in a simple proof I found of why c/2 and c/3 are speed limits for orthogonal and diagonal spaceships respectively.

Definition: In a gameplay of life, an "infinite lifeline" is a sequence of pairs (c_i,n_i) such that each c_i is alive in generation n_i and either c_(i+1)=c_i or c_(i+1) is adjacent to c_i.

Lemma ("Two Forbidden Directions"): Let x,y be any two 'forbidden' directions from among N,S,E,W,NE,NW,SE,SW. In any gameplay of life that starts finite and doesn't die out, there is an infinite lifeline that never goes in either direction x or y.

The lemma's proof uses biology. Say that (c,n) is a "father" of (c',n+1) if c' is the cell adjacent to c in direction x or y. Otherwise, (c,n) is a "mother" of (c',n+1). By the rules of the game of life it's easy to show every living (c,n+1) has at least one living father and at least one living mother. It follows (modulo some more details) that since the gameplay doesn't die out, there must be an infinite lifeline where each cell is a mother of the next, i.e., an infinite lifeline that never goes in direction x or y.

Proof of c/2 orthogonal speed limit: If a spaceship went faster than c/2, say, northward, by the lemma, it would have an infinite lifeline that never goes N or NE. The only way it could ever go northward would be to go NW. Every NW step would have to be balanced out by an eastward step (of which NE is forbidden) or the spaceship would drift west. So every northward step requires a non-northward step, QED.

Proof of c/3 speed limit for diagonal: A diagonal spaceship faster than c/3, say, northeastward, would have an infinite lifeline that never goes N or NE. The only way for it to go northward would be to go NW. Each NW step would need at least two eastward steps in order for the ship to go eastward, QED.
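As a sanity check of these bounds, a minimal set-based Life simulator (a sketch of my own, not anyone's production code) confirms the familiar glider travels one cell diagonally every four generations, i.e. at c/4, safely inside the c/3 diagonal limit argued above:

```python
from collections import Counter

def step(cells):
    """One generation of Conway's Life on an unbounded grid of live-cell coords."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)

# After 4 generations the glider reappears shifted one cell diagonally: c/4.
assert cells == {(x + 1, y + 1) for (x, y) in glider}
```

The same harness applied to an orthogonal ship like the LWSS would show 2 cells of travel per 4 generations, exactly saturating the c/2 bound from the first proof.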

nkrisc 7 hours ago 2 replies      
Skimming through the thread, I realized I had no idea of the type of community that exists surrounding Conway's game. I think it's awesome.
iamwil 3 hours ago 0 replies      

This link has an animation of the c/10 spaceship.

stcredzero 4 hours ago 0 replies      
I'm thinking about writing a Conway's Life MMO, where you can activate "lanterns" that illuminate rectangles in the grid with Conway Life squares. These lanterns are fueled with "living" Conway Life cells, which are harvested by the player. Sound interesting?
stephenitis 6 hours ago 2 replies      
I'm confused about what I'm looking at... is this a pattern that emerges in Conway's Game of Life?
Mauricio_ 6 hours ago 0 replies      
Announcing R Tools for Visual Studio microsoft.com
245 points by brettcannon  11 hours ago   71 comments top 17
smortaz 10 hours ago 3 replies      
Hi folks - Last year we polled HN on whether there'd be interest in R integration in Visual Studio. You said "YES!", so here it is! I'll be around in case you have any questions.

BTW RTVS was built by the same group that made PTVS (Python Tools for VS) and NTVS (Node.js Tools for VS). RTVS will also be free & open source of course.


Gratsby 8 hours ago 0 replies      
Microsoft is trying really hard to get me to like them again. It's starting to work.
Mikeb85 9 hours ago 2 replies      
While I'm somewhat ambivalent about Windows-only tools, it's nice to see R getting love from all the big players. The more market share and developers using it, the more the tools and implementation improves, and the better for all of us. Just finished an assignment using R Studio, Knitr and Plotly, so much less pain than Excel+macros or Stata.
hirenj 9 hours ago 1 reply      
This definitely looks like something I need to check out. I do a lot of one-off analysis for small bioinformatics projects, and I've taken to sharing results in the Microsoft ecosystem via OneNote. I've wrapped knitr into my own library (knoter) which generates the html and figures, and pushes the whole lot up to OneNote in the selected notebook. OneNote works well for this kind of collaborative analysis, where I need to keep track of the whole discussion somewhere.

One thing I'm missing from my workflow would be a way to integrate in to an IDE so I just push a button, and it'll commit the code to a gist, and push the output to a OneNote for other people to comment on. I'm wondering if it would be possible to fork this, and tweak the calls to knitr so they use my library instead.

tyfon 11 hours ago 1 reply      
I'll get to test this tomorrow at work; however, loading up a huge program instead of a web page where all the stuff is stored on a server is going to require some changes in behavior.

And I usually work from Linux, so it will be in a VM. But I'll try :)

markbao 7 hours ago 0 replies      
This looks stunning. Anyone know how it compares to RStudio? I might virtualize Windows to use this.
stevehiehn 11 hours ago 1 reply      
Sweet! If this works on the free Community edition, I'm gonna try it for sure.
namelezz 7 hours ago 1 reply      
Why not Visual Studio Code?
swalsh 9 hours ago 1 reply      
Stupid question, is it possible to take an R script and compile it to a .net CLR dll?
TheLogothete 10 hours ago 1 reply      
Bring the VCF from Azure ML to Visual Studio! I said it in the survey too. VCF is very important in some settings. Being able to code R + SQL and have workflow components in the same environment would be a killer feature.
_Wintermute 9 hours ago 0 replies      
Might try this out as I've had nothing but issues with the Windows version of RStudio.
myth_drannon 11 hours ago 2 replies      
Looks like a promising alternative to Rstudio on a Windows platform. If only it was available on Linux...
melling 10 hours ago 5 replies      
Does anyone have a short list of the best sites to learn R? I only have two on my list, and one is for advanced users:

http://adv-r.had.co.nz - Advanced R by Hadley Wickham


I keep my links on GitHub: https://github.com/melling/ComputerLanguages/blob/master/r.o...

kefka 9 hours ago 1 reply      
That's what I never understood about Microsoft. They're a software company, and some of their software is absolutely horrible (SharePoint), while some is the best in the business (Visual Studio).

There's a significant mindshare in Linux and Open Source. That being said, I don't understand why MS didn't provide Visual Studio, Office, and similar for Linux at a premium. For example, if Office was $499 for Windows, charge $999 for Linux. That way, they get the best of both worlds (use their software, pay them money). And their mindshare is significant as well, and this would increase it.

Maybe finally they are coming to their senses, doing just this. It's about time.

eruditely 6 hours ago 0 replies      
Microsoft is getting better and better, this is the rise of a new company.
cbo8of 7 hours ago 0 replies      
No love for C11?
vegabook 10 hours ago 5 replies      
Personally, I never did RStudio. The unfiltered command line is always more flexible, and prevents inertial lock-in to a specific tool or OS, to a large extent. Now I'm supposed to do R in ultra-bloated Visual Studio?? I learned R precisely to get away from the inefficiency of Excel (functionally unchanged, visual-candy-only, for the past 20 years). How am I to be excited about adding this thick, lumpy MS gravy to my pure R experience? So that when I do R on Linux or remotely I'm screwed? No thanks.

Addendum: Clearly the hive mind MS corporate drones are out in force today/tonight. I know everybody is enamoured with MS-Eclipse etc but R is best used unfiltered. Not hijacked into the laughable world of Windows and Visual Studio. I know from bitter experience that the Windows versions of R are terribly unstable by comparison with the Linux builds. I learned the latter precisely for that reason. MS is playing a fantastic marketing game but I was agnostic on platform until R on Windows started showing its catastrophic limitations. It's a second class citizen as soon as you venture beyond the basics. Take it from an ex R-on-Windows guy who uses R 10 hours per day.

Painkillers now kill more Americans than any illegal drug vox.com
322 points by davidbarker  13 hours ago   256 comments top 27
krschultz 11 hours ago 23 replies      
I've never taken an illegal drug in my life. I've smoked a cigarette about 5 times. I drank in college but lately I've cut that out of my life as well. I had a security clearance with drug testing requirements for a while and now I just don't like the feeling of a hangover from alcohol or the risk of ingesting random plants/chemicals made by shady people.

In short: I'm the most vanilla, square, anti-drug person you can find. I don't want to use them, and I think other people would be better off if they reduced their usage as well.

Yet I cannot for the life of me understand why drugs are illegal. Not just pot, all drugs. I'm totally onboard with making it our public policy that we want to reduce the use of drugs. That makes perfect sense to me. It does not make sense to me why anyone still believes that using the criminal justice system as the mechanism for getting to that goal is the right path. We are spending insane amounts of money on a failed approach while also generating huge negative side effects by creating an enormous group of people with criminal records. It's probably the worst thing this country has done to our own people since segregation, and it seems like all of the policy people understand this. Why can't we get the political will to do something different?

disposeofnick9 11 hours ago 2 replies      
I've lost at least two elderly extended family members this way. Both applied a patch and took a pill at the same time, which caused an OD.

The issue is that for many opioids and non-opioids, the gap between the therapeutic dose range and the LD50 is often dangerously narrow.

Complication #0: serum assays of the bioavailable molecule are rarely performed. People metabolize and clear drugs at vastly different rates.

Complication #1: Hospital mistakes still happen quite frequently, despite many measures to prevent them, especially with inexperienced and overworked nurses/assistants.

Complication #2: cumulative dosing errors or interactions, especially multiple, independent prescriptions for similar opioids with different administration routes (patches, sprays, pills, injections).

Complication #3: overprescription of opioids because they're cheap, especially to veterans, which also leads to prescription and hard drug addictions.

Solution: opioids need to be singularly controlled at home or in the hospital by an integrated, blood/interstitial fluid measuring/dispensing unit to avoid OD and push back on abuse.

Plus, anyone taking opioids should also have narcan or equivalent antidote readily available, and wear a medalert QR code bracelet which lists relevant conditions and medications should they be found unresponsive.

Finally, avoid painkillers as much as possible and take the least dose which reduces stress level.

6stringmerc 9 hours ago 0 replies      
I have chronic pain due to a hereditary, incurable condition. Right now one of my ankles is in an elastic wrap to ease the irritation from arthritis. In our home, we did have a bottle somewhere of Tylenol 3, aka codeine. Regular acetaminophen was usually what I got to help with an issue.

As I grew into adulthood, I knew the pains I experienced were directly related to my condition, and it was my desire to not really 'cloak' the pain, but avoid it in the first place. Preventive if you will. It helps, but it's clear to me that I wanted to be healthy, and if I have to occasionally take something, so be it. Naproxen sodium has worked quite well of late.

The point of all this rambling is that I simply don't want the hassle of becoming addicted to pain pills. Or sleeping pills. Or nasal spray when it's allergy season. I've lived with pain for so long that I'm kind of used to it, and I say so as a point of pride. It's the body I was born with and it's the one I'll have to use for this gig, so take care of it.

I don't fault people for wanting pain treatment. I think the way the system was set up with pills flooding the US was incredibly destructive, and highly indicative of the dangers of for-profit medicine as a system. Toss in the DEA's drug laws and it just turns patients into criminals and that benefits only a very limited group.

When I eventually started seeing commercials on TV for a treatment for opiod induced constipation all I could think about was Trainspotting and that we have a real, genuine problem on our hands in the US.

musgrove 9 hours ago 2 replies      
If 47k deaths per year is an "epidemic" as the author terms it, the 610k that die each year from heart disease must be an all-out pandemic. It never was a war on drugs. Drugs are inert and aren't capable of fighting a war. It's a war on addiction. And good luck winning that on a national scale with "laws."
brandonmenc 8 hours ago 1 reply      
Lots of comments from people who have no experience with chronic pain aghast that doctors would fulfill patient "demands" for painkillers instead of treating the underlying cause.

Pain is self-report, so all a doctor can do is prescribe based on patient demand. Maybe they can't identify an underlying cause, or maybe the treatment (ex: back surgery) is too risky.

Spine surgery that might not work and can leave you with say, loss of bladder control? I'd take the pill every time, and if my doctor didn't just hand it over, I'd find another doctor.

Havoc 4 hours ago 0 replies      
I try to stick to Aspirin & Paracetamol for this reason. Even Paracetamol feels a touch dicey given the liver failure stats.

However, I've been in decent pain for over a year before, so I know what it's like & can totally understand why people go for the powerful stuff. Continuous pain like that slowly but surely grinds your psyche to fine dust over the long run. That's the part that people without chronic pain miss...

AlleyTrotter 7 hours ago 0 replies      
Simple comment: What about the people who find relief from chronic pain with opioids and have no other option? These people are the ones who will suffer from the "we know what's best for you" crowd.
c3534l 2 hours ago 0 replies      
This isn't really surprising. Opiates have long been used as both recreational drugs and effective analgesics. All the major opiates people abuse besides opium itself were created at one point or another as a painkiller. It's unfortunate, but they're also really good at their jobs. I think that if you need prescription painkillers you should have them. Taken without wanton disregard they're actually fairly safe, although physical addiction is always possible.
jrapdx3 5 hours ago 1 reply      
It's troubling the way this article presents the issue. Treating chronic pain is an enormously complex problem that clinicians have to deal with, especially as it gets bound up with collateral pitfalls of drug dependence, politics of health care delivery systems and conflicting pressures from patients, government regulators and others.

While unethical prescribers (not all are physicians) contribute significantly to rising misuse of opioids, the vast majority of practitioners want to do what's best for patients. As the article notes, there are few options for managing chronic pain, leaving opioids the only realistic choice in many instances.

None of the providers I know think opioids are preferable, but more like a necessary evil. They prescribe opioids sparingly, reluctantly, diligently. Patients have told me it's become increasingly difficult to get prescriptions for quite modest doses of opioid agents they've used for years without dose escalation. The tendency to throw babies out with the bathwater is not unique to this situation, but no less problematic.

Blaming pharmaceutical companies doesn't seem a constructive approach. Probably there's a lot of R&D going on in this domain without much success, meaning it's a very hard problem to solve. I'm certain that a major breakthrough would be eagerly marketed, highly likely the profit margins would be huge. Meanwhile, we're left with the status quo, and manufacturers are meeting market demands. Isn't that how our economy works? Pharma sales are already more highly regulated than nearly all other industries, what more should be done?

Few legal drugs are as controlled in the US as schedule II opioids. If there were no such controls, it's likely that the number of overdose-related deaths would be higher than it is. No one knows what solution will work; the need to be careful about changing the "rules" should be obvious.

The article's advocacy of "medical marijuana" as an alternative is IMO inappropriate. Simply enough, research on the uses of cannabis components for pain treatment is in very early stages. Specific indications and side-effect risks are inadequately understood. Recommending use of these components as treatments for pain is premature.

joveian 11 hours ago 1 reply      
This seems like a particularly limited article, although a better slant than many. The NY Times just had this (also not wonderful, but with some additional information) article a few days ago: http://www.nytimes.com/2016/03/07/us/heroin-epidemic-increas...

While the title mentions heroin, the article at least mentions that deaths are frequently due to more deadly prescription painkillers being mixed in. One thing I wonder that I haven't seen addressed (I'm not sure if there is even data available) is how many overdose deaths are due to use of multiple drugs at the same time (alcohol for example makes many drugs more deadly).

Hopefully there will be more and better reporting on the issue. IIRC (and Wikipedia agrees at least), these numbers mean that drug overdoses are now killing non-trivially more people in the US than car accidents.

ashwinaj 6 hours ago 0 replies      
This is why you need a counterbalance to the monopolistic tendencies of the free market. Be it in the form of regulations or making companies liable for their greedy actions.

It has been proven time and time again that systematically removing "common sense" [0] regulations only harms society in the long run.

[0] Please don't start a mundane discussion about what "common sense" means.

brbsix 4 hours ago 0 replies      
The really sad thing about this is that nature has a remedy for the grip of opiate addiction, iboga, yet it is illegal as well.
user_0001 12 hours ago 3 replies      
What are the rates like compared to other countries?

Does the US just overprescribe painkillers, meaning more flood to the black market?

Is it people are getting it from the Dr and accidentally ODing?

Are the Drs prescribing without care, so those who want the drug for a high and no medical reason can?

I never knew painkillers to be used as a party drug / fun drug in the UK (outside of the heroin-using demographic), nor ever heard of someone ODing on prescribed painkillers.

Seems strange it is such a big issue in the US

mc32 12 hours ago 7 replies      
So painkillers used against prescription kill more people than any individual illegal drug, and since people demand painkillers to treat chronic pain, physicians are looking to treat chronic pain with alternatives. One such alternative is MJ, because misuse doesn't generally result in fatal overdoses.

Vox, stop with the hyperbole.

nathanvanfleet 8 hours ago 2 replies      
Just so you know, opioid painkillers are actually not useful for chronic pain at all. Over the long term they actually make patients' sense of pain GO UP. They're excellent for non-chronic pain, however.
lazyant 6 hours ago 1 reply      
Are there any alternative treatments of pain that don't involve drugs? for ex http://www.570news.com/2016/03/06/waterloo-man-praises-local...
cpfohl 12 hours ago 2 replies      
Knowing what I know about opioid painkillers, I don't think I'd ever accept a script for them. I'd accept them in the hospital, but never in a bottle that goes home with me...
njharman 11 hours ago 0 replies      
Joining cigs and alcohol, eh?
swillis16 12 hours ago 1 reply      
All it takes is one or two extra pills to get high from the standard opioid pain prescription. It would've been nice to see this mentioned in the article but it seems pretty light on content.
tosseraccount 9 hours ago 1 reply      
Rhetorical question: how many of these deaths also involve alcohol?
yarou 2 hours ago 0 replies      
For some reason, I never saw the appeal of opiates.

Granted, I use them somewhat occasionally (as needed) for pain, but they don't really cause in me the compulsive, addictive behavior I've read about. My internet addiction (HN included) is far worse than any chemical substance I've ever used.

joesmo 10 hours ago 2 replies      
Send patients home with Narcan and train the people they live with to administer it as well. Have every EMT, firefighter, and police officer in the nation carry and know how to administer Narcan. Have it be part of every single first aid kit sold in this country. Remove the social stigma of drug abuse. Remove penalties for people who help others who are overdosing. You'd think someone would have some common sense in this country but you'd be wrong. It will never get better the way things are going now. It's ridiculous to even have a fucking article like this that doesn't mention the numerous tried and true solutions that exist but are simply not being put into place because the people in power in this country want to see people dying.

The problem isn't that we don't have solutions. Solutions are aplenty. The problem is that no one in America cares. No one in this country gives a shit that people are dying. Most people want it to happen. They support the fucking drug war. They want people to die. Until this fucking shit changes, people will continue to die and idiots will continue to wonder, what can we do? So many fucking things, I don't even have time to write them all down. That's the fucking sad part.

FussyZeus 11 hours ago 0 replies      
Is this really surprising? All the benefits of illegal drugs without the risks involving prison time and public disgrace. All you need to do is figure out what things to tell your Doctor to make him think you need one of these things, and you have a legal (and probably insurance funded) supply.

Not saying of course that everyone who gets these doesn't need them, I'm sure many do, but we have something like 90% of the worldwide consumption occurring in the States, so something is clearly up.

julie1 12 hours ago 0 replies      
Prescribing opium ... a trend that has not been seen since the Victorian era in the UK.

Opium has a reputation for making people amorphous, sapping their will to rebel.

The new trend is that opioids are now cheap, and prescribed not to the rich but to the poor.

Religion used to be the opium of the people, they said; now that opium is cheap, religion is no longer needed to make people servile.

I love this new era of progress.

Tomorrow shall we make an application to help poor parents sell their kids' body parts on the internet to cure richer people?

I mean, let's try to make it even more dystopic. We can do better. That is what progress is: making the system more efficient.

bobby_9x 10 hours ago 0 replies      
It makes sense just based on statistics. Americans have more access to painkillers than to any illegal drug, and that access alone will result in more misuse (and death).

If illegal drugs were all made legal tomorrow, we would see something similar.

fapjacks 8 hours ago 0 replies      
Kratom can solve this problem, but the FDA won't have it.
Lave: eval in reverse github.com
8 points by danso  1 hour ago   1 comment top
javajosh 5 minutes ago 0 replies      
This is very interesting. It is a serialization format generalized to handle real references, including circular ones, and more complex data-types than JSON can handle. I think it's a very poor tagline, since "eval in reverse" sounds useless, and this lib is anything but. (no affiliation btw)

The down-side is that a consumer has to run a full-blown eval() (as opposed to the more restricted JSON.parse()). This isn't that much of a downside in a typical webapp since you have full control over the browser process anyway, but it's deadly for cross-domain.

The upside is considerable for certain data-structures that are hard to represent in JSON efficiently, e.g. with a lot of denormalization.

A key concern for me is runtime efficiency, particularly compared to JSON.stringify.
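The trade-off described above is easy to see in a few lines. The sketch below is not lave's actual API, just the underlying idea it implies: JSON.stringify rejects a circular structure outright, while an eval-able source expression can rebuild the reference graph on the consumer side:

```javascript
// A circular structure: node references itself.
const node = { name: "root" };
node.self = node;

// JSON.stringify cannot represent the cycle and throws a TypeError.
let jsonFailed = false;
try {
  JSON.stringify(node);
} catch (e) {
  jsonFailed = true;
}

// An eval-based format instead ships an expression that reconstructs
// the graph, cycles included, when evaluated by the consumer.
const source = '(() => { const o = { name: "root" }; o.self = o; return o; })()';
const restored = eval(source);

// jsonFailed is true, and restored.self === restored after the round trip.
```

This is also why the security caveat above matters: the consumer runs arbitrary code, so the format is only acceptable where you already trust the producer completely.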

Ansible vs. Chef (2015) tjheeta.github.io
85 points by fanf2  7 hours ago   34 comments top 13
dkarapetyan 1 hour ago 0 replies      
If you're using yaml and templates then you've already lost. The only tool in this game that is not braindead is chef. Sometimes you need imperative things and conditional logic with iteration. If you don't have a real programming language then the contortions you have to go through get really old really fast.

As for the deployment patterns. If you're in the cloud then you should be baking AMIs (or equivalent in your cloud provider) and shipping your configuration the same way you ship your application code, as native packages like .deb or .rpm. If you jumped on the docker bandwagon then your hosts are basically there to look pretty and host the containers which means you have some other way of getting configuration to your servers, i.e. etcd, consul, etc. so the problems brought up in this post don't exist in that setting. You are also probably using some kind of container orchestration system like kubernetes so again the problem of orchestration and deployment is offloaded to some other system. The only problem you have in that setting is doing a rolling deploy of containers and halting when things go wrong.

I think the only place any of these tools make sense now is some private on-premise cloud. Every other place has already moved on.

mdeeks 3 hours ago 2 replies      
Ansible has scaled really poorly to thousands of hosts for us. Things we have run into:

- Running a job against a single host will finish in 3 minutes... running that exact same job against thousands will take well over an hour and max out your machine.

- Running against more than around 3k hosts will somehow consume all 60GB of RAM and trigger the oom-killer

- CPU usage on the ansible runner is absurd for a large amount of hosts. We're currently using a c4.8xlarge (our biggest box) just to run deploy jobs and have them finish in a reasonable amount of time (10-15 minutes)

Slicing up our inventory into chunks and running them on different servers sucks big time and is pretty hacky. How do I combine the results? Can't do orchestration like "Run X on these roles first, then run Y on these roles when you're done".

Most likely what I'm going to do is have a single server execute ansible doing only the following in async (aka CPU friendly) mode:

- Upload a current copy of ansible to S3

- Upload the configs to the target machines with ONLY the secrets that role needs in plain text. (I'm not putting my vault secret on every box!)

- Have the servers pull it down and execute in --connection=local mode.

- Wait until each remote finishes

All that said, I LOVE LOVE writing stuff in Ansible. It is so easy to read, follow, and understand. I picked up most of it in a day or two just by reading their "Best Practices" page. Getting it to work at scale hurts though :(

falcolas 4 hours ago 1 reply      
> it works reasonably well on the scale of thousands of hosts

I could see this if you're working from one really powerful machine... no, that won't work, it's constrained by SSH, not hardware specs.

I could see this if you're calling Ansible on another host... no, then you have to copy everything out to the sub hosts, who have to copy everything out to their sub hosts... A scalability nightmare.

You can use redis as a distributed store of truth and... wait, what? Now there's a blog post worth reading. Show us how to scale Ansible with real-world examples using redis and autoscaling groups. Please.

Got it. I can see this work reasonably well if you're willing to wait 10 hours for a deploy to complete. Personally, I'm not.

> If you want to have 1000 forks, that will cost about 30 GB of memory

Ansible is not, nor has it ever been, limited by available memory. It's limited by the number of concurrent SSH sessions it can handle while copying every single module to be executed to that host.

There's plenty of reasons and ways to use Ansible for deploying code. Some of the post has accurate and reasonable information, but the scaling portion is pure fantasy right now.

heavenlyhash 1 hour ago 1 reply      
This might be off-topic (or it might be a breath of fresh air if you're tired of configuration managers) --

I've been toying with the idea of making a trolling-but-no-really deployment framework called tarpipe, and all it does is take some files on your host, get em to a $place in one step, and run hook.sh. Oooooptionally, do some dir moves and symlinks to keep a prior state backed up, and service stop|start on either side of the mv/ln, to minimize downtime.

Usage could be `tarpipe ssh user@host` or `tarpipe <(echo "cd keke && bash -c")` just as easily.

It goes without saying that this simply wouldn't be comparable to Ansible, Chef, or other CM because it's too simple. It doesn't help manage state if it escapes $cwd. But if your application can curb its enthusiasm to a directory... boy is it simple if that's true.

I already do this on the daily to crashland my bashrc and dotfiles on any new remote host. Maybe this kind of explicitly zero-dep deploy would be useful for more situations.

Would anyone have a use for that?


EDIT: What the heck: I prototyped it: https://gist.github.com/heavenlyhash/b575092aa84ce9f3e1d2

danielvf 2 hours ago 0 replies      
I have done several small deployments (under two dozen servers) with Chef, Puppet, and Ansible. I've also evaluated Salt and worked with other people's Bconfig systems. Ansible is the best for my situation, hands down. I no longer have any of the others in use.

Ansible is easy to reason about - it's never surprised me once in use. You have about an order of magnitude less to learn when compared with chef or bconfig.

Also, for setups with small target VMs, it's incredibly handy to not have to install a bunch of stuff on each server and make sure it doesn't conflict with anything else.

But mostly it's that Ansible can be understood enough without devoting a couple weeks of your life to it. And you can come back later and understand what you have written.

dmourati 5 hours ago 0 replies      
If someone will rewrite this article as Ansible vs. Puppet I will not only buy you a beer but I will throw a parade in your honor.
jv22222 4 hours ago 4 replies      
If you are thinking about working with either of these (or puppet) please check out salt stack it is a pleasure to work with and equally as powerful.
LukeHoersten 1 hour ago 1 reply      
Some things I really like about Ansible are:

- Super simple declarative yaml configs

- Agentless. You needed to have SSH working anyway so Ansible just uses that. With ssh pipelining it's so fast.

- The community support is huge and extensive.

- They have a module for everything and development is constant and active, much of which from the community.

- Hardware and networking equipment can be provisioned just the same as a VM or OS image.

The list goes on. Definitely give Ansible a try.

3lux 3 hours ago 0 replies      
If you'd like something simpler than both, have a look at: https://pressly.github.io/sup
glasz 4 hours ago 0 replies      
ansible ftw. i really regret having started with chef a few years back.
GauntletWizard 4 hours ago 0 replies      
This is pure bunk. Let's go down the list, one by one:

Maintenance: Sure, chef has a server component. So does ansible, if you're using it the way he suggests (with a host periodically running ansible playbooks on all hosts). Ansible has no client component to upgrade, though, so that's a win, right? It totally is, until Ansible doesn't work on a host and you can't figure out why, and the error logs you get are useless because some of Ansible's many assumptions about what the host's initial state is are incorrect. Chef can be managed by a standard package manager, which costs nothing on the client side, and allows far, far better assumptions to be made.

For the record, I eventually gave up on the chef server, replicated the playbooks to each machine (using a cronjob and git), and ran chef-solo.

Speed: Ansible pipelining speeds it up significantly. You can almost get one command a second! Chef runs on host. It is ruby, and goes slow, but I have programmed a lot of chef and run a lot of Ansible, and my average chef run was under 30 seconds, and I've yet to have Ansible run any playbook in under a minute. Some of this is from atrocious default behavior, like requiring all hosts to complete a step before moving on to the next step on any hosts, or the fact that it spends nearly 10 seconds of cpu time on each machine 'gathering facts' at the beginning of its playbooks, even if none of those facts are ever used.

Fact caching: This is a solution to the aforementioned problems with Ansible. It may make sense in the chef-server context, but I don't have a whole lot of experience with it.

Tags: This is probably a matter of personal preference, but I prefer to give the set of things that need to be done, and have the tree descend downwards based on dependencies from there. The author clearly prefers to specify with tasks when they should happen, and for each host a set of initial circumstances. This one can be argued til you're blue in the face. I make the point that there's a clear tree that can be built of dependencies under my scheme.

Push vs Pull: There's no maintenance cost to upgrading? What is this dude on? When you change Ansible revisions, you have to do just as much work adapting as from chef revision to chef revision. Ansible has always been highly in flux, and not great about not changing default behavior.

Pulls still have to be triggered, but they can (and should) be triggered on-host, in a cronjob. Your monitoring system should alert you when the chef run is out of date, though, honestly, if it is failing on just some of your hosts you need to clean up and unify your infrastructure.

Raw numbers: Ansible scales to one large machine. Chef costs you a tiny amount on each machine. One of these scales. One of these does not.

Search and inventory: Oh gods, if you're using Ansible for inventory management, please don't. If you're using chef for inventory management, please don't. Neither is a reasonable tool for the job.

Orchestration: Neither chef nor ansible is an appropriate tool for dealing with your application's data model. Full stop. Actually, full stop. There's nothing else of value further down this article. Please don't take any of its advice.

jdubs 4 hours ago 1 reply      
"but I prefer using git submodules" - nope.

Great tutorial!

awjr 4 hours ago 1 reply      
It may be me but I rather like http://kubernetes.io/ as this takes the approach that you are deploying an application not a set of disparate services.
RIP Google PageRank score: A retrospective on how it ruined the web searchengineland.com
230 points by adamcarson  11 hours ago   104 comments top 18
jandrese 10 hours ago 6 replies      
I feel like no matter how Google or any other search engine ranked pages, SEO firms would be there to game the system and make a mess of the web. Making PageRank visible to people with one particular toolbar does seem like a fairly major misstep on Google's part. The majority of the people who would really care about that are the kind of people you shouldn't be encouraging.

One of the obnoxious things about SEO is that if one person is doing it everybody has to do it. It's not necessarily enough to simply offer a better product at a better price. Luckily Google does try to reduce the effect of SEO. I notice for instance that StackExchange almost always beats out Expert Sex Change links these days.

rco8786 10 hours ago 5 replies      
Really crummy title. PageRank is the reason we HAVE a search as powerful as Google, and largely the reason the web is as good as it is today.

Raise your hand if you want to go back to AltaVista/AskJeeves.

justinlardinois 10 hours ago 2 replies      
I never knew PageRank scores were visible, and I never used the web before Google.

But this article is so far up its own ass.

> Ever gotten a crappy email asking for links? Blame PageRank.

Never mind that web rings were around long before Google and used the same tactics.

> Ever had garbage comments with link drops? Blame PageRank.

There are way more reasons spammers exist than just boosting PageRank.

The author is acting like a) Google had less of an influence on the web before PageRank was public information and b) the web was somehow better both back then and before Google existed. There will always be people who want to game search engine results, regardless of how much information they know about their own standing, and the web was pretty much un-navigable pre-Google.

jzawodn 9 hours ago 1 reply      

Back in 2003 I wrote:

"PageRank stopped working really well when people began to understand how PageRank worked. The act of Google trying to "understand" the web caused the web itself to change."


It's amazing that it took this long.
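For readers who never saw the formulation being referred to, the core of classic PageRank is a short power iteration. This is a toy illustration with an invented three-page link graph and the commonly cited damping factor of 0.85, not Google's production system:

```javascript
// Each page splits its rank evenly across its outlinks; a damping
// factor d mixes in a uniform "random jump" so rank cannot get trapped.
const links = { a: ["b", "c"], b: ["c"], c: ["a"] };
const pages = Object.keys(links);
const d = 0.85;

// Start from a uniform distribution and iterate toward the fixed point.
let rank = Object.fromEntries(pages.map(p => [p, 1 / pages.length]));
for (let iter = 0; iter < 50; iter++) {
  const next = Object.fromEntries(pages.map(p => [p, (1 - d) / pages.length]));
  for (const [page, outs] of Object.entries(links)) {
    for (const target of outs) {
      next[target] += (d * rank[page]) / outs.length;
    }
  }
  rank = next;
}
// Ranks sum to 1; page "c", with the most incoming link weight, ranks highest.
```

Once this mechanism was public knowledge, every extra inbound link became a purchasable commodity, which is exactly the gaming the quote describes.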

tyingq 10 hours ago 0 replies      
The real problem is that Google was losing the link spam war until very, very recently. It was trivial to game them up until 2010, and only really became relatively difficult somewhere around 2012.

And, the solution looks roughly like "weigh established authority to the point where it trumps relevance".

Animats 8 hours ago 1 reply      
Google has dealt with web spam by replacing it with their own ads. Search for "credit card" or "divorce lawyer". Everything above the fold is a Google ad. Air travel searches bring up Google's own travel info. No amount of SEO can compete with that.

(I still offer Ad Limiter if you'd like to trim Google's in-house search result content down to a manageable level.)

kyledrake 9 hours ago 1 reply      
I first learned about this after starting https://neocities.org and seeing a bunch of really garbage pages that were full of random text that linked to a derpie site somewhere.

We get pagerank SEO spam from time to time, and it's pretty annoying. I have the tools to take care of it within 5 minutes every day, but I do worry that if we grow to a certain point it may no longer be possible for me to handle the problem alone.

I'm sure many other sites have similar problems with comment spam, and I'd love to hear some advice on how to deal with this from sites that have the same problem.

Right now our main lines of defense are a recaptcha (our last remaining third party embed, ironically sending user data to Google I'd rather not send to deal with a problem Google largely created), and a daily update of an IP blacklist we get from Stop Forum Spam.

I tried to do some Bayesian classification, but didn't make much progress unfortunately. And nofollow really isn't an option for me, as it would involve me manipulating other people's web sites and I don't want to do that.

al2o3cr 9 hours ago 0 replies      
Better title: "How SEO Asshattery Turned The Web To Shit"
_yosefk 10 hours ago 0 replies      
"How gravity ruined flying"? PageRank looking at links isn't some arbitrary thing, it's a source of information every good search will take into account.
hartator 10 hours ago 0 replies      
I think it's odd to perceive the end of a relatively transparent metric - however relevant or not it may have been - as a good thing.
rgovind 9 hours ago 1 reply      
Does anyone here think we need a search engine which lets us maintain large blacklists of websites? For example, if I am searching for information about airbnb, I do not want news websites like NY Times, WSJ, Forbes, Business Standard etc. to show up in the results at all. Any business-related question on India is invariably dominated by the Times of India and other newspapers. With Google, it's becoming increasingly difficult to filter out websites.

Edit: Changed "on airbnb" to "about airbnb"

kin 8 hours ago 0 replies      
Just because I can't see the score doesn't mean I'm not going to do what I can to increase it.
kazinator 10 hours ago 0 replies      
What difference does it make if the semantics of PageRank are still in place for determining position in the search index, but it is just hidden?

You can still infer the approximate rank of a page by where it places relative to other pages, when searching for relevant keywords. Someone wanting to place ahead of the competition still has a function for measuring how well they are doing in SEO.
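The inference described here is purely ordinal: your position among known competitors in the results list is itself the score. A trivial sketch (the site names are hypothetical):

```javascript
// Given search results in ranked order, a site's 1-based position
// relative to tracked competitors serves as a rank proxy.
const serp = ["big-competitor.example", "my-site.example", "small-rival.example"];

function relativeRank(site, results) {
  const i = results.indexOf(site);
  return i === -1 ? null : i + 1; // null when the site isn't listed at all
}

// relativeRank("my-site.example", serp) is 2: one competitor still ahead.
```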

hackuser 5 hours ago 0 replies      
Another way to look at this is a blow to openness and a concentration of Google's power. The PageRank scores still exist, but they now will be known only by (some? all?) Google employees.

Therefore, the data is no longer open and power is now more concentrated: Those who know someone at Google can find out their page rank score; the 99.999...% of the rest of the world cannot.

vorg 5 hours ago 0 replies      
Reading this makes one realize how easy it was for Groovy's Tiobe ranking to jump from #82 to #17 in just 12 months, as shown at http://www.tiobe.com/tiobe_index?page=Groovy , and the other spikes in its history.
gchokov 9 hours ago 0 replies      
Indeed, one of the things I don't like (hate?) about Google is the SEO and PageRank BS. All pages in the last 10 years have started to look the same. All pages are becoming what Google wants them to be.
runn1ng 7 hours ago 1 reply      
PageRank is still visible today?!? Where? (I am just curious; I thought it hadn't been visible anywhere for years)
VikingCoder 9 hours ago 1 reply      
I just lost a ton of respect for Danny Sullivan.

Every system can be gamed. Every system where money can be made WILL be gamed. It's a predator-prey relationship.

The way this article was written made it sound like Google Search was a bane when it arrived. And sure, it was the worst Search Engine at the time, except for all the others that had been invented up until then.

Ping Stick sensitiveresearch.com
24 points by l1n  3 hours ago   4 comments top 4
veenified 1 hour ago 0 replies      
Video demonstration is available for download here: http://sensitiveresearch.com/Ping%20stick/images/wall,%20edg...
foota 1 hour ago 0 replies      
Seems like it could make a useful aid to people with vision impairment?
arde 46 minutes ago 0 replies      
This would be a great phone app.
daodedickinson 1 hour ago 0 replies      
I accidentally clicked on the up arrow for this article. Then I decided to check the article to see if it was worth an up vote anyway.

>This page uses a plugin that is not supported

I guess my Chrome doesn't do .mov files in embed tags.

Well anyway, as far as embodied cognition goes, I believe that, in a certain sense, a star is part of my body while I'm looking at it. Post-it notes are part of our memory.

Seems interesting. Reminds me of the glasses that flip your vision upside down until your brain flips it back right side up and then when you take the glasses off you see upside down with no glasses. And also of blind people echolocating.

Google joins Open Compute Project to drive standards in IT infrastructure googleblog.com
121 points by rey12rey  9 hours ago   24 comments top 5
Animats 14 minutes ago 0 replies      
Finally, racks go metric.

The 19 inch rack is one of the oldest standards in computing. ENIAC used 19 inch racks. Open Compute, though, uses wider, and metric, racks. 19 inch rack gear can be mounted in Open Rack with an adapter.

The Open Compute spec says that shelves of IT gear are provided with 12 VDC power. There's power conversion in the base of the rack. Facebook has standards for distribution to the racks at 54 VDC, 277 VAC 3-phase, and 230 VAC (Eurovolt). Apparently Google wants to add 48VDC, which was the old standard for telephone central offices.

Facebook's choice of 54VDC distribution is strange. Anyone know why they picked that number?

wyldfire 9 hours ago 2 replies      
> energy efficient and more cost effective ... engaging the industry to identify better disk solutions for cloud based applications

My pet issue w/IT infrastructure is the management modules. Finding a server w/a management module that works every time is nigh impossible. Do Google and Facebook design their own or do they somehow just work around their quirks?

atomic77 9 hours ago 3 replies      
I haven't followed this project too closely, but, it seems interesting that it has taken this long for Google to join (and the conspicuous absence of Amazon). Anyone able to speculate on why now?
godzillabrennus 7 hours ago 1 reply      
This is great news and just another nail in the coffin of what Wired calls the Fucked By Cloud vendors: http://www.wired.com/2015/10/meet-walking-dead-hp-cisco-dell...
wilhil 5 hours ago 1 reply      
And, as not an employee of a multi billion pound company, how can I get involved?!

I ask every time, and this project is amazing, but it feels like it's just for the big guys!

A look at the Technology behind the 4D Game Miegakure [video] youtube.com
16 points by krmkaos  2 hours ago   4 comments top
simonebrunozzi 53 minutes ago 3 replies      
What the hell is a 4D game? I just hate buzzwords.
Let's Encrypt client will transition to a new name and a new home at EFF letsencrypt.org
253 points by riqbal  13 hours ago   31 comments top 6
riscy 10 hours ago 0 replies      
> Another reason is that we want it to be clear that the client can work with any ACME-enabled CA in the future, not just Let's Encrypt.

Great to see that they are actively aware of CA monopolization, and taking steps to avoid becoming one themselves.

heavenlyhash 10 hours ago 4 replies      
Anyone looking to use Let's Encrypt and free to make choices regarding their server may want to check out https://caddyserver.com/ -- it has Let's Encrypt support baked right in.
waskosky 1 hour ago 1 reply      
If you were like me and holding out on Let's Encrypt until Windows XP is supported (even Chrome is still broken on XP) it looks like a date of March 22nd has been set for "getting new cross-signatures from IdenTrust which work on Windows XP."



desireco42 7 hours ago 1 reply      
Let's Encrypt really helps get SSL everywhere. It is not super easy to set up, but I am sure this will get better as time goes on; this is huge.
_jomo 12 hours ago 1 reply      
mioelnir 8 hours ago 4 replies      
Missed opportunity to move beyond the reach of NSLs.
Graph Databases 101 cray.com
8 points by BooneJS  1 hour ago   discuss
Modern concurrency tools for Ruby github.com
79 points by sciurus  7 hours ago   3 comments top 3
cschneid 50 minutes ago 0 replies      
I've used this library (the promises part) and it works fine. Error handling as a proc is mildly awkward, but worked out fine.
mattiemass 2 hours ago 0 replies      
Shame on me for thinking that you needed other languages for good concurrency. This looks awesome!
tomc1985 4 hours ago 0 replies      
This is cool! Both to use, and to study. Thank you!
What Kind of Lithography Will Be Used at 7nm? semiengineering.com
45 points by Lind5  5 hours ago   11 comments top 3
asmithmd1 4 hours ago 1 reply      
At 7nm spacing you could draw 100,000 lines across the width of a human hair. About 70 silicon atoms lying next to each other would be 7nm wide. I know the end of Moore's law has been predicted before, but this time has to be different.
xlayn 5 hours ago 2 replies      
Interesting how many chip-makers now have the power (read: financial and human resources) to keep the race at what seems to be the same level. Of them all, Apple is the most impressive, as they have kept the core count down. In my book, being able to squeeze in another 4 cores to reach 12 doesn't make that much sense on cellphones and laptops/desktops.
pnut 5 hours ago 0 replies      
Machine elves, obviously.
First Preview of Android N: Developer APIs and Tools android-developers.blogspot.com
140 points by krat  10 hours ago   56 comments top 12
ubertaco 10 hours ago 3 replies      

>Improved Java 8 language support - We're excited to bring Java 8 language features to Android. With Android's Jack compiler, you can now use many popular Java 8 language features, including lambdas and more, on Android versions as far back as Gingerbread. The new features help reduce boilerplate code. For example, lambdas can replace anonymous inner classes when providing event listeners. Some Java 8 language features -- like default and static methods, streams, and functional interfaces -- are also now available on N and above. With Jack, we're looking forward to tracking the Java language more closely while maintaining backward compatibility.

pjmlp 10 hours ago 2 replies      
Besides finally saying something about Java support, I found other items interesting.

ART will recompile applications based on profiling data.

Introduction of support to hardware keystores, with the mention that one use case is to prevent jailbreaking.

Preventing NDK users who ignored the documentation and linked to non-official platform libraries from continuing to do so.

trequartista 10 hours ago 1 reply      
Extremely interested in the multi-window support. With phone screens getting bigger than ever, this could be a very useful feature for multi-tasking
RivieraKid 9 hours ago 0 replies      
I wish they'd add grouping to the window switcher, something similar to what's in WebOS, because it's somewhat difficult to implement tabs without cluttering the UI in apps such as web browsers, reddit clients, mail clients, etc.
riskable 8 hours ago 4 replies      
I was really hoping for something more exciting. Like maybe native support for a new, different language (like Apple did with Swift).

It would be absolutely amazing if Google came out with a mechanism for building native apps in, say, Rust.

AdmiralAsshat 7 hours ago 2 replies      
I'm disappointed to see that it will take Android N for the Doze feature to be practical [0]. As it currently stands, Doze only activates when the device is stationary. My phone never leaves my pocket, since I'm paranoid about setting it down, so the gyroscope being engaged is enough to prevent Doze from triggering.

[0] http://www.androidpolice.com/2016/03/09/android-n-feature-sp...

Wonnk13 10 hours ago 0 replies      
the $150 discount on Pixel C is really telling. It's clear that revenue is higher for tablet optimized apps and games and Android really needs to step up to compete with Apple.

I'm just a hobby developer, but the last four-ish months have been really exciting. There's a been ton released and polished in the developer console alone.

fulafel 8 hours ago 0 replies      
I wonder why the Pixel C developer discount is us-only.
tdkl 7 hours ago 0 replies      
Hope the faster release will also show in a faster update cycle for vendors, or some kind of guarantee that devices that got/get M will also get N. The quick reply API and notification tweaks are pretty great, and Doze is now useful (seems similar to Sony's Stamina mode).
geodel 10 hours ago 1 reply      
So Java 8 support is finally here. I think this is the effect of Android moving to OpenJDK.
oDot 7 hours ago 3 replies      
I really wish they would get the camera and gyro APIs on par with iOS so Instagram could port Hyperlapse
creshal 10 hours ago 1 reply      
I guess I'll take a look at the APIs in 5 years, when it has enough market share to be worth considering.
Show HN: Security Training for Developers hacksplaining.com
48 points by malcolmhere  5 hours ago   15 comments top 11
billyhoffman 1 hour ago 0 replies      
Slick and a nice UI, but the security advice in this is just plain terrible.

Blacklist input validation as defense against XSS? Are you kidding me? And then over to session fixation, where I see the exact same ?jsessionid=blah example that has been in any Web Security book for the last 10-15 years? Come on!
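The failure mode described here is easy to demonstrate; a minimal Python sketch (the blacklist filter is hypothetical, not the site's actual code):

```python
import html

# A toy blacklist filter of the kind being criticized: strip <script> tags.
def blacklist_filter(user_input):
    return user_input.replace("<script>", "").replace("</script>", "")

# Bypass 1: nest the tag; stripping the inner copy reassembles the outer one.
payload = "<scr<script>ipt>alert(1)</scr</script>ipt>"
print(blacklist_filter(payload))  # <script>alert(1)</script>

# Bypass 2: skip <script> entirely with an event-handler attribute.
payload2 = '<img src=x onerror="alert(1)">'
print(blacklist_filter(payload2))  # passes through unchanged, still executes

# The standard defense: encode output for its context instead of
# trying to enumerate every bad input.
print(html.escape(payload2))
```

This is why blacklists keep losing: the defender must enumerate every encoding and vector, while the attacker only needs one miss.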

cpcarey 2 hours ago 0 replies      
I'm enjoying this a lot. The explanations are straightforward and the writing and animation style is entertaining. I'm liking the website parodies and the puns in the alt texts. I'm learning new things and the linked resources are good for going in-depth. I'd probably pay for advanced lessons in this style. I'll be recommending to friends!
greggh 43 minutes ago 1 reply      
This is so beautiful that I wish it was good advice, but it's not. Some of these examples actually introduce problems. SHA-256? Really?
barbs 1 hour ago 1 reply      
At a glance this seems to be aimed mostly at web developers. How much of this would be relevant for a native mobile developer like myself?
zmitri 1 hour ago 0 replies      
Enjoyed this a lot. Great starting point for anyone interested.
michaelbuckbee 2 hours ago 0 replies      
I like Troy Hunt's web security stuff - I'd gotten into it on Pluralsight, but then moved jobs and don't have access. I did find a free course (With SQL Injection, etc.) of his here: https://info.varonis.com/web-security-fundamentals
bsrx 3 hours ago 0 replies      
CiPHPerCoder 3 hours ago 0 replies      
Went through the SQL injection demo, and it recommends parametrized queries. Excellent.


Joined with Github, went through the password handling section, then saw this:


No no no no NO! Do NOT use SHA256 for passwords.



PBKDF2-SHA256 with 100k or more iterations? Okay, fine.

SHA256 the cryptographic hash function not designed for password storage? Bad advice.
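Both halves of that advice fit in a few stdlib lines; a sketch (the 100k iteration count echoes the comment's floor and should be tuned to your hardware, not copied):

```python
import hashlib
import hmac
import os
import sqlite3

# Parameterized query: the driver binds the value, so it is never spliced
# into the SQL text and quoting tricks can't change the statement.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
evil = "x' OR '1'='1"
conn.execute("INSERT INTO users (name) VALUES (?)", (evil,))
rows = conn.execute("SELECT name FROM users WHERE name = ?", (evil,)).fetchall()
print(rows)  # matched as a literal string, no injection

# Password storage: salted, iterated PBKDF2 rather than one bare SHA256.
def hash_password(password, salt=None, iterations=100_000):
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

salt, stored = hash_password("hunter2")
_, candidate = hash_password("hunter2", salt)  # verify: recompute with stored salt
print(hmac.compare_digest(stored, candidate))  # True
```

The constant-time comparison at the end matters too: a plain `==` on digests can leak timing information.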

bsrx 3 hours ago 1 reply      
Any comments on who put this together, or their long term goals?
cphoover 3 hours ago 0 replies      
very well put together
SandersAK 3 hours ago 1 reply      
Awesome! This is great!
Train your own image classifier with Inception in TensorFlow googleresearch.blogspot.com
93 points by rey12rey  10 hours ago   5 comments top 4
masonhipp 5 hours ago 0 replies      
This is awesome --> "In order to make research progress faster, we are additionally supplying a new version of a pre-trained Inception-v3 model that is ready to be fine-tuned or adapted to a new task. We demonstrate how to use this model for transfer learning on a simple flower classification task."

Fine-tuning these models for different applications has been a great way for me to build out new things without relying on an enormous fleet of K40s to train a new set from scratch. Lots of progress in this field, thanks to the whole team for releasing this.

jszymborski 8 hours ago 1 reply      
This is great! I was just writing a small script to prepare data for TensorFlow CNN image classification based on a custom dataset using SciKitFlow, but the InceptionV3 model is super cool and it looks like they have an implementation with an almost-compatible API to what I was writing [1].

I'm super impressed by what's coming out of Google's TensorFlow. Their ImageNet InceptionV3 model is a delight to play with in python!

[1] https://github.com/tensorflow/models/blob/master/inception/d...
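The retraining step being discussed is essentially "train a small new classifier head on frozen bottleneck features." A toy numpy sketch of just that step, with random vectors standing in for Inception-v3's 2048-dim bottlenecks (nothing here uses the real model or its API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are bottleneck features from the frozen network
# (2048-dim in the real model; 8-dim here), for two flower classes.
n, d, classes = 200, 8, 2
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

# New "head": one softmax layer trained by gradient descent. Only these
# weights are updated; the feature extractor is never touched.
W = np.zeros((d, classes))
b = np.zeros(classes)
onehot = np.eye(classes)[y]
for _ in range(500):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / n          # softmax cross-entropy gradient
    W -= 0.5 * (X.T @ grad)
    b -= 0.5 * grad.sum(axis=0)

acc = ((X @ W + b).argmax(axis=1) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

That's why fine-tuning is so much cheaper than training from scratch: the expensive convolutional layers are reused as-is and only this tiny head is learned.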

aub3bhat 6 hours ago 0 replies      
This is great news; the earlier released model had some limitations, such as that it could not be used with a batch. With this and multi-GPU training, TensorFlow is now a good alternative to Caffe.

Here is my project with the TensorFlow inception model: https://github.com/AKSHAYUBHAT/VisualSearchServer

ganeshkrishnan 2 hours ago 0 replies      
Is anyone using Java for running the TensorFlow examples?
Fake and cheap 3D metaball edankwan.com
21 points by tobltobs  4 hours ago   3 comments top 2
Impossible 2 hours ago 0 replies      
This is pretty cool. The technique reminds me of a similar rendering trick outlined in this Nvidia presentation (http://developer.download.nvidia.com/presentations/2010/gdc/...)
tobltobs 3 hours ago 1 reply      
Show HN: Open source stackoverflow-like service paizaqa.herokuapp.com
5 points by yoshiokatsuneo  55 minutes ago   discuss
Facebook Advertising Strategies for Early-Stage Startups interstateanalytics.com
100 points by jamiequint  9 hours ago   25 comments top 5
physcab 7 hours ago 1 reply      
I also have similar experiences running ad budgets ($5M / month at highest). I built tools to do ad optimization on FB using attribution as a backbone and optimizing for ROI. I learned the hard truth with FB advertising when I built my own iPhone app last summer and thought all this experience would help me. Turns out when you're spending $2500 per day per campaign it's easier to sit back and optimize, but when you're spending $100 in a month it's completely different.

Turns out that the first thing you need to do is figure out if FB is the right channel for you. I found out that on FB, anyone will download anything that looks interesting and you can optimize your CPI fairly easily. But if you count on people spending money via in-app purchases, the typical rules (1%-5% of active users) don't always apply for apps of different genres.

shostack 8 hours ago 2 replies      
The bit about attribution is critical, but misses a more important basic piece, which is "make sure you set up and properly QA your conversion tracking."

So many companies launch FB ads without proper tracking and then are surprised when they have no idea what it did for them. FB tends to group everything under the sun as "engagement" and "conversions", so really digging in and understanding those settings is key.

For example, 1-day view-through credit by default is probably a bad idea for many advertisers, particularly when you have no clue what the quality of a view-through is, and what they are worth to you. They are VERY different in terms of value, but FB wants to give them 100% credit with their rules within 1-day. That's simply not how most savvy people approach attribution.

Google Analytics offers some great basic attribution tools out of the box that let you experiment and compare different static models, or create your own static model. Ultimately static models themselves have inherent limitations because attribution is a much more dynamic thing that exists at the individual user path level, but it is a great start.

cm2012 7 hours ago 1 reply      
One thing I would add is that you don't need a large list to make good lookalikes off of - some of the best lookalikes were generated from only 5,000 emails.

Lookalikes almost always outperform interest targeting.

aabajian 6 hours ago 3 replies      
Wow. I didn't know you could upload a list of your customers' email addresses to Facebook and have it build an audience of similar users. This seems highly unethical -- who knows what FB will do with your customers' email addresses.

Edit: I also wonder what would happen if I uploaded just my own email address? Would it find people very similar to me?

jgalt212 7 hours ago 4 replies      
I've been advised that advertising enterprise products (for the US market) on Facebook is a bad use of resources. Then we looked at LinkedIn, but their pricing and targeting was disappointing.

What's the best platform to advertise enterprise software on? I think IP level targeting is a good strategy, but curious to hear about other ideas.

John Carmack on Idea Generation amasad.me
82 points by amasad  2 hours ago   16 comments top 6
pbw 59 minutes ago 2 replies      
I like the buildup here with "Antifragile". Sounded very cool and I was eager to see how it would apply to "Idea Generation". But Carmack's 5 steps seem to boil down to "Try to tear down ideas before you implement them" don't they? Just needlessly expressed as 5 separate steps?
smallhands 11 minutes ago 0 replies      
Any chance of watching this Carmack Facebook video?
halayli 44 minutes ago 0 replies      
> Proprietary software is usually used in controlled environments all the while building up fragility for a major catastrophic event waiting to happen

Not necessarily true. OS X, Oracle, MS etc. are used by many and are under constant stress testing.

rdudekul 1 hour ago 0 replies      
"Failure events must end up making our system stronger. Meaning when an idea fails it needs to make the overall system better."

"As soon as you get an idea you try to defeat it. You'll be able to generate more ideas because you freed up mental space."

These are good ideas, both from coding as well as entrepreneurial perspectives. Every failure you encounter, if taken as a learning lesson, will make you or your ideas stronger.

nevir 45 minutes ago 1 reply      
How much of this hinges on having enough tooling to be able to efficiently explore/prototype/abuse your ideas?
programminggeek 1 hour ago 5 replies      
I tend to write my ideas down in a Google doc as a way of throwing them away. Then I keep going on my current one focus. If the idea is good enough, I always manage to come back to it and finish. If I don't, it probably sucked.

I wrote a more comprehensive essay on ideas here: http://brianknapp.me/books/creative-pursuit/chapter-7/

White House's Claims That the TPP Would Curb Internet Censorship Are Fantasy eff.org
127 points by DiabloD3  7 hours ago   10 comments top 5
gpm 5 hours ago 0 replies      
It's only semi-related, but Michael Geist has been going through the TPP and writing about problematic parts of it for Canada. He is very focused on Canada, but I imagine a similar list could be generated for any non-American country. Below is a link to the latest post, which contains links to the rest at the bottom:


fweespee_ch 6 hours ago 2 replies      
As much as I like EFF's coverage in general:

> And we will continue to press our partners to allow digital information to cross borders unimpeded. We are working to preserve a single, global Internet, not a Balkanized Internet defined by barriers that would have the effect of limiting the free flow of information and create new opportunities for censorship.

This is technically correct. It doesn't create new opportunities for censorship. These opportunities always existed.

> The TPP illustrates these shortcomings well. Its free flow of information rules would only be enforced for foreign enterprises, and only those entities based out of countries that have signed the TPP. So if a country were to enact a law banning some type of online content, the TPP's free flow of information rules would do nothing to prevent the enforcement of that censorship against websites or platforms that are locally-owned in that country.

Yes and nothing prevents that already. The TPP doesn't alter the sovereignty equation, it is a trade treaty.

The TPP makes things worse but such weak attacks against statements that are technically correct are largely ineffective.

Posts like these are much more effective as they show legitimate problems that could have been resolved favorably (for the general population) in a trade treaty:



snowwrestler 1 hour ago 0 replies      
I find the EFF to be so weird sometimes. They do great, essential work on protecting encryption, but then occasionally post outright dumb stuff like this (from the linked post above):

> Its free flow of information rules would only be enforced for foreign enterprises, and only those entities based out of countries that have signed the TPP.

Yes, because it's an international trade agreement.

afarrell 6 hours ago 1 reply      
Does an annotated copy of the TTP exist anywhere?
Esau 5 hours ago 0 replies      
I am convinced that this administration would say just about anything to get this passed.
EmacsGolf (2013) jcarroll.com.au
42 points by lelf  7 hours ago   5 comments top 4
binaryblitz 14 minutes ago 0 replies      
So while vim is my terminal editor of choice, I use Sublime for programming. I can do the challenge in 12 keystrokes, 11 if you don't have to delete a trailing line.

1. control+g (Goto line number)
2&3. 11 (line 11)
4. enter
5. cmd+shift+down
6. cmd+x
7. backspace (delete trailing line)
8. cmd+shift+up
9. cmd+shift+l (multiple cursor mode)
10. right arrow (line up cursors)
11. tab
12. cmd+v

girzel 22 minutes ago 0 replies      
That's where these comparisons always fall down: you're trying to do something in X number of keystrokes in vim, whereas in Emacs you'd write your own function to do whatever it is. How do you compare those?

Coincidentally (or not), re: the challenge in the above post, I wrote a command that solves exactly this problem: yank-interleaved. https://www.emacswiki.org/emacs/KillingAndYanking#toc3

How many keystrokes does it take to install and use that function?
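For comparison outside any editor: the transformation the challenge asks for, interleaving two blocks of lines, is a couple of lines of Python (the buffer contents here are made up, not the original post's):

```python
# Interleave two blocks of lines, as in the golf challenge.
block_a = ["line 1", "line 2", "line 3"]
block_b = ["alpha", "beta", "gamma"]

interleaved = [line for pair in zip(block_a, block_b) for line in pair]
print("\n".join(interleaved))
# line 1
# alpha
# line 2
# beta
# line 3
# gamma
```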

omaranto 2 hours ago 0 replies      
Tim Visher recorded a series of videos of himself playing VimGolf in Emacs [1]. If you want to play there is a great VimGolf client for Emacs [2].

[1] https://vimeo.com/timvisher/videos

[2] https://github.com/igrigorik/vimgolf

melling 6 hours ago 1 reply      
I started an Editor Rosetta Stone of sorts.


Emacs and vim have been battling it out for 40 years. There's probably room to improve the typing efficiency of both.

Emacs has ergoemacs and god-mode, for example.



Plus you can use vim key bindings.

Is a modal editor better? Should key bindings be built around a more efficient keyboard layout, like Programmer Dvorak, for example?


       cached 10 March 2016 05:02:01 GMT