Hacker News with inline top comments, 21 Oct 2015
BMW i8 in WebGL playcanvas.com
260 points by antouank  4 hours ago   73 comments top 23
1
elcct 9 minutes ago 0 replies      
You wouldn't download a car?
2
phoboslab 3 hours ago 1 reply      
Really nice work. This doesn't seem to be an official site of BMW, though!?

Porsche[1] and Renault[2] did something similar lately. It's great to see WebGL used for this in production. Honestly, I'm surprised it took so long. Visualizing cars with WebGL seems like a no brainer, especially when most current websites load dozens of images for their "360 views".

[1] http://www.porsche.com/microsite/911/germany.aspx#showroom/9...

[2] http://www.littleworkshop.fr/renaultespace/

3
jasonkester 3 hours ago 6 replies      
To whoever it was a few weeks back who said it looks like the i8 was "giving birth to a Porsche", you were right: Once you've seen it, you can't unsee it.

As this visualization demonstrates in nice 3d form.

4
iMark 3 hours ago 2 replies      
I remember waiting 30 minutes for a crappy vrml model of the Enterprise to download over a 28k modem years ago.

This is both impressive, and makes me feel very old.

5
dvh 3 hours ago 1 reply      
When you get in and turn the steering wheel and then get out, the wheels are also turned :)
6
fsloth 2 hours ago 6 replies      
I am perplexed. Modern games are fantastically more visually appealing and content-wise impressive than this. Modern films utilize computer graphics that are far more impressive still. What is it about a single-model viewer that makes it headline material? It looks nice, though.
7
atrust 19 minutes ago 0 replies      
Well done. This one reminds me of http://helloracer.com/webgl/.
8
jaegerpicker 1 hour ago 1 reply      
It's a cool demo, but I wish it wouldn't autorotate. It makes clicking the door interaction point much more difficult than it should be. A pretty-picture-over-usability decision.
9
forgotmypassw 2 hours ago 3 replies      
The scene looks very nice, although I'm not sure what's so spectacular about displaying a single model. I've noticed someone in the thread saying it's impressive because you can view it on mobile, but why wouldn't you be able to? Modern phones have better hardware than what we had over a decade ago, and we had 3D games back then, so displaying a single higher-poly model on far better hardware doesn't really amaze me. Besides, I don't really dig the whole idea of WebGL, but the damage's done already.
10
stefap2 1 hour ago 0 replies      
In Firefox I get "Loading...". In IE it shows only a blurry picture of what appears to be the front lights?
11
JanSolo 1 hour ago 2 replies      
I LOVED the Orangey-Brown colour scheme so much that I went straight to the BMW website to see if it was really an option that you can buy. Sadly, it's not. /sadface
12
apocalyptic0n3 1 hour ago 0 replies      
In case anyone is having similar issues, this loaded a LOT better in Firefox than Chrome
13
killer4u77 24 minutes ago 0 replies      
That looks really cool
14
Elizer0x0309 1 hour ago 0 replies      
Why I love graphics programming! Math & Art come together into a holistic whole! Great work!
15
samuell 3 hours ago 0 replies      
First time in a long time that my laptop fan peaks a little :P
16
lectrick 2 hours ago 0 replies      
Ah, the car that fakes internal combustion engine noise when it's running on electric... No thanks. Nice demo though!
17
dahart 1 hour ago 0 replies      
On an iPad, I can only see the top left quarter of the view.
18
m52go 3 hours ago 0 replies      
This is stunning. Excellent work!
19
ponyous 3 hours ago 1 reply      
I love camera behaviour! Good job
20
psiops 2 hours ago 0 replies      
Very cool. The look of the turning steering wheel seemed a little off to me though. As if it's slightly off-center.
21
mixedbit 3 hours ago 0 replies      
Beautifully done. I particularly like the lights glow effect.
22
middleman90 3 hours ago 0 replies      
Impressive...Congratulations guys at PlayCanvas!
23
robodale 2 hours ago 0 replies      
Loading...
Teen Who Hacked CIA Director's Email Tells How He Did It wired.com
23 points by phesse14  29 minutes ago   19 comments top 6
1
ryandvm 12 minutes ago 1 reply      
The worst news here is that the director of the most powerful information gathering agency on the planet uses AOL.
2
mkobit 9 minutes ago 3 replies      
> The hackers described how they were able to access sensitive government documents stored as attachments in Brennan's personal account because the spy chief had forwarded them from his work email.

How is this acceptable? Shouldn't he be held accountable for this kind of stuff?

3
freditup 8 minutes ago 1 reply      
How do you design a system that's hardened against social engineering but still forgiving of innocent mistakes, like losing your password? It seems like the easiest way to access public systems like this is through social engineering around password recovery, or phishing.

Of course there are well-known answers that mitigate these problems somewhat: 2FA, login images, etc. But I still feel as if social engineering attacks hit a really vulnerable weak spot in many systems.

(On a mostly unrelated note, can we get rid of security questions forever? I've taken to just giving nonsense answers for them and storing my answers somewhere secure. I sure don't want my passwords being reset because somebody knows my mom's maiden name...)
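A minimal sketch of the "nonsense answers" approach above, using Python's standard `secrets` module (the question list is purely illustrative; store the generated answers somewhere secure, e.g. a password manager):

```python
import secrets

def nonsense_answer(nbytes=16):
    # a random, unguessable "answer" that no attacker can look up
    return secrets.token_urlsafe(nbytes)

# illustrative security questions
questions = ["Mother's maiden name?", "First pet's name?"]
answers = {q: nonsense_answer() for q in questions}
for q, a in answers.items():
    print(f"{q} -> {a}")
```

`secrets` draws from the OS's CSPRNG, unlike `random`, so the answers are suitable as credentials.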

4
sageabilly 9 minutes ago 2 replies      
AOL doesn't support two-factor authentication for email sign-in. If it did, this entire debacle could have been stopped before it even started.

I'm also surprised that the government doesn't have more stringent guidelines about the private email use of its top officials.

5
ryanlol 6 minutes ago 2 replies      
Why is this even a story?

Has there been any confirmation that this account even actually belonged to the CIA director? If yes, has there been any evidence that there was actually anything sensitive on the account? (I seriously doubt the latter)

If there was nothing on the account, how is this different from any of the other tens of thousands of AOL accounts that have been hijacked since the '90s?

6
fein 12 minutes ago 1 reply      
Social engineering is, and will always be, the fastest way to compromise a system.

Computers are pretty good at security; humans, especially underpaid and overworked helpdesk jockeys, are not.

Frequent Social Networking Associated with Poor Functioning Among Children [pdf] liebertpub.com
31 points by redgrange  1 hour ago   4 comments top 4
1
cyanbane 55 minutes ago 0 replies      
When I read the title I thought "Association" could imply causation, but I think the second-to-last sentence in the abstract helps clarify:

"The findings suggest that studentswith poor mental health may be greater users of SNSs."

2
xixi77 54 minutes ago 0 replies      
Now, let's see how long it takes until someone cites this as evidence that social networking causes poor functioning :)
3
lordCarbonFiber 53 minutes ago 0 replies      
I think leveraging virtual spaces to provide greater access to mental health support is a great step forward to improve outcomes for troubled adolescents. This definitely opens interesting avenues for future research.
4
09bjb 25 minutes ago 0 replies      
Interesting to see how instant-gratification-type habits like endlessly browsing your news feed are the new low-grade drug habits. I'm just waiting for the study that finally proves that surfing twitter for more than 60 minutes causes a 300% higher likelihood that you'll visit a porn site in the next five.
Docker Acquires Tutum docker.com
128 points by samber  2 hours ago   36 comments top 15
1
falcolas 1 hour ago 2 replies      
I try not to comment about this too much, but the text on the Docker site is stupidly hard to read. The font color, #7A8491, has a contrast ratio of 3.8:1 (black on white has a ratio of 21:1), which is barely above the W3C accessibility standard for _large_ text (18 point, or 14 point bold; the bar is higher still for the thin-stroke text the Docker page uses, Helvetica Neue Thin).

Fix this, please, Docker. A few more points towards black isn't going to destroy the look and feel of your page.
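The 3.8:1 figure above is easy to reproduce; here's a rough sketch of the WCAG 2.0 contrast math (color values from the comment; an illustration, not a vetted accessibility tool):

```python
def channel(c):
    # sRGB channel (0-255) to linear-light value, per WCAG 2.0
    c = c / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hex_color):
    # relative luminance of a "#RRGGBB" color
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (1, 3, 5))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    # WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#7A8491", "#FFFFFF"), 1))  # 3.8, as the comment says
print(round(contrast_ratio("#000000", "#FFFFFF"), 1))  # 21.0
```

WCAG 2.0 AA asks for at least 4.5:1 for normal text and 3:1 for large text, which is why 3.8:1 only scrapes past the large-text bar.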

2
dpeterson 14 minutes ago 0 replies      
I did a quick smoke test to see if Tutum passes muster. My smoke test for this kind of tool is whether they have a solution for deploying MongoDB as a production-level service with sharding, or at the very least replica sets. Like so many other "let's do the easy part and stop there" companies, they have a template for starting a single development MongoDB node, which would be easy enough to do myself. I want a tool that has a repository of templates for making formations of very hard things easy. OpenShift is at least working on it: https://github.com/openshift/mongodb (their replica set version is not able to persist data permanently until Kubernetes figures out how to attach separate persistent volumes to pods in the same service). Unfortunately, Amazon is again the only game in town that does exactly what I want: https://aws.amazon.com/blogs/aws/mongodb-on-the-aws-cloud-ne....

If I want to run locally, I have to ditch Docker entirely and just use Ansible: https://github.com/ansible/ansible-examples/tree/master/mong...

3
geerlingguy 2 hours ago 2 replies      
This seems to be a good move towards a more stable revenue generator for Docker, the company, but I'm more interested in what the long-term Docker ecosystem implications are.

I think Docker is just passing through the early adopter status in terms of actual production usage (it's much more mature in its lifecycle for dev), and having one of many cloud Docker providers owned by Docker might have a chilling effect on other 'container in the cloud' providers using Docker as their primary container format/platform.

4
sciurus 54 minutes ago 1 reply      
Do I have this right?

In 2011 dotCloud launches as a platform-as-a-service company.

In 2013 dotCloud releases docker, software based on the lessons they learned building their PaaS product.

In 2013 Tutum starts to build a PaaS based on Docker.

In 2014 Docker (renamed from dotCloud) sells their PaaS to cloudControl.

In 2015 Docker buys Tutum.

5
thepumpkin1979 59 minutes ago 0 replies      
I tried Tutum a couple months ago. The onboarding experience was awesome, the free image builder was super fast, there were metrics of my processes everywhere; I loved it. The struggle started when I deployed a real app with a little workload: after a couple of hours the metrics didn't work at all, processes got stalled, and the whole UI became useless because I had zero visibility into what was going on.

I switched to Heroku only to realize that I had the same problem there too. Obviously it was an issue in my app, but at least Heroku gave me a specific R14 error code and a description of what was happening, and I finally knew what I was dealing with. For the next 48 hours that I was debugging the memory leak I had my dynos switched to 1X to get even more resource metrics; once the issue was solved I switched my dynos back to hobby.

I'm considering going back to Tutum now that I have deferpanic installed and configured in my app and my Heroku bills are around 100 USD monthly (20 USD SSL endpoint x 3 + 7 USD hobby dynos x 3 + 22.50 USD Compose RethinkDB), but I was shocked to realize how much value a mature PaaS can deliver for clients, even for a hobby-ish app like mine.

6
tacotuesday 32 minutes ago 1 reply      
I'd like to know what Tatum offers in comparison to fabric8.io. It seems the "video intro" and "take it for a spin" links are the same, and not a video introduction. That's disappointing. Maybe someone in charge of the page can fix it please?

Searching for "Tatum video introduction" on a search engine only returns results about a certain movie star, which is not terribly helpful.

7
csears 1 hour ago 1 reply      
Tutum mentions they manage persistent volumes and handle mapping them to containers at runtime, presumably across different nodes/hosts.

Anyone know how that actually works? Is it similar to Flocker at all?

8
JoshGlazebrook 2 hours ago 1 reply      
I've used Tutum for a little while now and I love it. I'm just wondering what Docker's plans are for when Tutum leaves beta. It would be nice if it stayed free. The tentative pricing seems decent enough at $7/node/month, but it could be better.

edit:

updated tentative pricing image url as it looks like someone has deleted it off the tutum slack team site.

http://i.imgur.com/i5k8nkr.png

9
serverholic 27 minutes ago 0 replies      
What is the recommended development environment for this? Docker Toolbox? What's the recommended setup?
10
flowerpot 1 hour ago 1 reply      
I somehow felt like the Docker team was closer to the Rancher team, so I thought Docker might acquire Rancher at some point. I think this is a move to produce revenue in the future, while Rancher remains yet another open source project to monetize.
11
jaboutboul 2 hours ago 1 reply      
Only a matter of time before Red Hat acquires Docker...
12
joeyspn 2 hours ago 2 replies      
Well, that was fast... After the announcement of the Amazon Container Registry, Docker had to make some move, and this is a great one.

Congrats to the Tutum team; they have built a really nice project that makes it really easy to build and maintain container pipelines.

Can't wait to see the integrations with the other docker tools...

13
jakobloekke 2 hours ago 0 replies      
So that's why the "Sign in with Github" button was hidden behind a link this morning ... :)
14
vargalas 2 hours ago 0 replies      
Great news!
15
tomwbarlow 1 hour ago 0 replies      
Hurray! :)
How we're changing Colombia through open-source communities medium.com
19 points by robermiranda  1 hour ago   discuss
The Collapse of the US-EU Safe Harbor microsoft.com
79 points by twsted  3 hours ago   47 comments top 10
1
buro9 3 hours ago 3 replies      
The proposals do not address the elephant in the room and the very reason Safe Harbor collapsed: The NSA and the ability of the US government to override any treaty to access any data using secret warrants.

It is that which killed Safe Harbor, and none of the proposals at the end of the article would be immune to that weakness again.

It would remain the case that the proposals made would not be in line with the clear ruling that the European court gave if the US government can continue to override international treaties and their own courts.

2
forgotAgain 2 minutes ago 0 replies      
He makes the issue more complex than necessary for the benefit of his employer. There is no reason why private information needs to move across borders without the express consent of the individual involved. At that point the individual agrees to be bound by the rules of the country where the data is going or no transaction is done.

Let each country have its own set of rules and have all countries respect those rules for data located in the hosting country.

The idea that each country must be exactly the same and data is by default available for transmission across borders is only to the benefit of multinational companies.

3
hopeless 2 hours ago 0 replies      
Microsoft are at the centre of another case which will really decide how badly EU-US trade is affected:

> Microsoft stands in contempt of court right now for refusing to hand over to US authorities, emails held in its Irish data centre. This case will surely go to the Supreme Court and will be an extremely important determination for the cloud business, and any company or individual using data centre storage. If Microsoft loses, US multinationals will be left scrambling to somehow, legally firewall off their EU-based data centres from US government reach.

from http://www.irishtimes.com/business/ecj-ruling-on-irish-priva...

At the moment, data can be held within the EU by US companies and it's all ok. If Microsoft is forced to hand over emails stored within the EU to the US government, then all bets are off.

In that future, it may not even be enough to have an EU-based subsidiary of a US company hold data within the EU, since it'll have been shown that the U.S. government can coerce them.

We like to talk about large companies like Microsoft, Apple, Facebook, Google etc., but they can throw money, lawyers and engineers at this problem; the thousands of US-based SaaS apps do not have that luxury. Likewise, there are thousands of small EU-based SaaS apps that will have everything from their hosting stack, to their bug tracker, to their communications tools taken away from them.

4
grabcocque 1 hour ago 0 replies      
The ECJ was simply responding to a fairly obvious and fundamental problem: your private data IS NOT SAFE in the US. The US government doesn't care and has no intention of changing this, so expect the ECJ's ruling to stand for a long time.
5
jacquesm 3 hours ago 1 reply      
Color me impressed with the no-nonsense and respectful way in which Microsoft tackles this. Looking forward to other tech giants following suit.

The 'privacy is dead' crowd should really take notice of this article.

6
BuildTheRobots 2 hours ago 1 reply      
Site seems down. Copy/paste from google-cache here (too big to submit as a comment): http://pastebin.com/0jLCA65D
7
devy 52 minutes ago 1 reply      
To me the key idea from Brad Smith's post, which I don't necessarily agree with, was this:

 Third, there should be an exception to this approach for citizens who move physically across the Atlantic. For example, the U.S. government should be permitted to turn solely to its own courts under U.S. law to obtain data about EU citizens that move to the United States...
What he's really arguing is that the EU should not invalidate Safe Harbor because doing so breaks the Internet, and that Microsoft will provide governments in the U.S. and EU access to its customers' data only "in the most limited circumstances". In that sense, it's nothing out of the ordinary for Microsoft's typical position on this issue. They can certainly do better than that, e.g. throwing away the server-side encryption key like Apple does for iOS devices, so that they don't have the technical capability to give out user data even if summoned to.

8
linkregister 1 hour ago 0 replies      
Google cache: http://webcache.googleusercontent.com/search?q=cache:0BKIRj9...

I'm getting an Internal Server Error on the original page.

9
pfortuny 1 hour ago 0 replies      
The excuse that the legal rules are obsolete is a red herring.

It depends on the rules.

For example: privacy of communications has no intrinsic dependence on technology. Security of personal data requires the verification of said security (or the commitment to it), etc...

I do not know about this specific law. But just because a law is old does not mean that it is bad. And this is what Microsoft is saying.

After hundreds of years of slavery, it was abolished in the US in a single day. So what? Is this bad?

10
mtgx 2 hours ago 1 reply      
> Government officials in Washington and Brussels will need to act quickly, and we should all hope that Congress will enact promptly the Judicial Redress Act, so European citizens have appropriate access to American courts.

Well, Microsoft is wrong here to believe that the Judicial Redress Act [1] will be sufficient. The CJEU has required "essentially equivalent" privacy protections for EU citizens as they get in the EU.

The US Privacy Act does not give them that, so the Judicial Redress Act falls short.

The US needs to pass a much stronger privacy law that is "at least" as good as the one in the EU, if it wants its companies to continue to get EU citizen data (and I assume it does). It can start by finally reforming the ECPA for the 21st century.

[1] http://judiciary.house.gov/index.cfm/press-releases?id=9455F...

The Lost History of Gay Adult Adoption nytimes.com
42 points by pmcpinto  1 hour ago   18 comments top 3
1
DavidAdams 1 hour ago 1 reply      
I'm sure that in legal circles there are a whole host of similar "law hacking" examples. This seems like a particularly ingenious approach. I'd be interested to learn about other circumstances where laws are creatively misused to achieve noble ends. (The examples where the law is misused for nefarious ends are too numerous to mention.)
2
gohrt 46 minutes ago 3 replies      
The article doesn't mention adult adoption in Japan at all, which is strange. Japanese adult adoption (a centuries-old practice) was covered in the US news and on HN in the past year.
3
GeorgeOrr 1 hour ago 3 replies      
Not that long ago people tried to argue that marriage equality wasn't needed to protect basic rights of individuals involved. They just needed to use contracts/adoption/etc.

This is a great article to remind us how bogus that argument was.

Starbucks and Fiat Chrysler tax deals 'illegal' in Europe bbc.com
79 points by tellarin  4 hours ago   73 comments top 7
1
gizmo 1 hour ago 1 reply      
Just for the record, the Netherlands has been proudly advertising their tax deals for years.

See this presentation[1], slide 12. Right from the horse's mouth:

 Reason 7 [to have a holding company in Holland]: Fiscal climate: Very competitive tax climate from its far-reaching tax treaty network to the possibility to conclude socalled[sic] advance tax rulings.
Utterly blatant. And notice the logo of Starbucks next to it. The Dutch government advertises that Starbucks pays practically nothing in tax, in order to undercut other EU countries.

These tax deals usually take the form of a fixed tax guarantee: the company agrees to place their holding company in the Netherlands and pay X euros in tax for the next N years (2 to 5), regardless of their actual revenue or profit. For the Dutch government this is just free tax revenue and if they don't make a sweetheart deal with the multinational the holding company would end up in Luxembourg or Ireland instead. This way the multinational can make the countries fight for the most preposterously low offer.

[1]: https://www.rijksoverheid.nl/binaries/rijksoverheid/document...

2
jedrek 4 hours ago 4 replies      
Good. I don't understand why companies need to be bribed to come to markets. Want to sell your goods? Great: pay the same taxes every other company pays. This is nothing more than a government subsidy for specific private companies.
3
bpodgursky 2 hours ago 6 replies      
In 100 years we're going to look back and realize what a waste of time and effort corporate income taxes are. And what amount of time was wasted trying to levy and avoid them.

In the end, you can recover the same money by taxing dividends and income accordingly, and it's far more difficult to hide those (a person's residency is less ambiguous than a corporation's; and while you can try to play games, with proper enforcement you will end up in jail for doing so).

Of course any politician will get castigated for suggesting removing the tax entirely, but that's just politics and not sound economic policy.

4
hellofunk 3 hours ago 0 replies      
There are a lot of stories lately that throw the idea of "union" in "European Union" into question. The tax discrepancies are just part of the picture; the varied immigration policies are also getting a lot of attention. Hopefully the EU can find a way to see itself as a truly unified whole, which would be of great benefit to the world.
5
ricw 2 hours ago 3 replies      
Up to 30 million in repayments since 2012? Is that it? These companies have billions in revenue and yet that's all they have to pay? How about fining them, and the countries, for doing so? This just seems wholly unfair.

This may be a small step in getting global players to play fairer, but from what I can tell this is still cheating the system and depriving countries and their citizens of badly needed tax income. All while competing unfairly with smaller non-global companies.

I don't get why the EU still hasn't managed to get this under control.

6
forrestthewoods 1 hour ago 1 reply      
Sounds like the states should be on the hook rather than the corporations. Sneaky sneaky Netherlands and Luxembourg. Betraying your own union for self gain! How unsocial of y'all.
7
SixSigma 3 hours ago 3 replies      
We should value companies not by earnings but by tax contributions.
Yahoo Talent Exodus Accelerates as Marissa Mayer's Turnaround Flounders recode.net
44 points by SeanBoocock  2 hours ago   23 comments top 7
1
throwitaway1981 52 minutes ago 3 replies      
I'm curious: how bad is it to leave a new job soon after starting (because you realize the product isn't as great as you thought it would be, or there's too much competition in the space)? When you are a high performer, you really don't want to waste everyone's time if you end up in this bad situation. What is the professional thing to do?

Asking because the article mentioned some people staying only for a few weeks. In a small startup, not seeing traction would be reasonable cause for departure since your paycheck directly depends on it. In a large company, things are not always interrelated in the short term. So there is more time to try out things. I guess the question is: how long should one try to make a new job work?

2
orf 2 minutes ago 0 replies      
> Jon McCormack: At the end of last year, Mayer scored a coup in the hiring of the Amazon star to be in charge of all of Yahoo's mobile engineers. He only stayed a few weeks.

Ouch.

3
coldpie 55 minutes ago 10 replies      
What does Yahoo do?

Microsoft has Windows, Office and Xbox.

Amazon has massive web server and product distribution platforms.

Apple has OS X and iPhone.

Google has Android, search, ads, and an online office suite.

What does Yahoo do? I guess they have some long-time email users and a well-liked stock tracking platform?

4
xacaxulu 19 minutes ago 0 replies      
This is really too bad for diversity.

>>Just last week, Yahoo lost two senior women execs: development head Jackie Reses to Square and marketing partnerships head Lisa Licht.

Before that, another exec once close to Mayer, CMO Kathy Savitt, left for a job at a Hollywood entertainment company, although sources said that was due in part to increased estrangement between her and Mayer.

5
rokhayakebe 53 minutes ago 0 replies      
Anybody can sail a ship in calm waters.

She has been the CEO of a publicly traded company for 3 years, one that was "seemingly" in trouble. 3 years is a long time for a company that was founded 3 years ago. Yahoo is 21 years old and still alive (and somewhat doing well).

6
api 19 minutes ago 0 replies      
I do not envy being tasked with turning around Yahoo. If I were her I'd more radically reinvent the company and buy more startups, maybe even pivot the whole thing. But it's a public company top-heavy with MBAs so you're really quite limited... her job is probably like driving a car with a dead elephant strapped to the trunk and a flat tire.
7
venomsnake 2 hours ago 0 replies      
For something to flounder doesn't it have to first exist? There was no turnaround.
Show HN: Metabase, an open source business intelligence tool metabase.com
22 points by tlrobinson  50 minutes ago   4 comments top 2
1
mark_l_watson 13 minutes ago 1 reply      
I like the idea and the use of AGPL. This license lets companies release useful software and still keep the door open for selling alternative licenses, if that is what Metabase does. I have been thinking of writing something similar for natural language access (similar to what I used in my first Java AI book fifteen years ago) to relational and RDF/OWL data stores, and Metabase provides a nice example of how to distribute a general purpose standalone query tool.

Also +1 for being a Clojure app. I am going on vacation tomorrow and am loading the code on my tiny travel laptop for reading.

2
eis 5 minutes ago 1 reply      
Any such open source tools are welcome.

How does it compare to re:dash? https://github.com/EverythingMe/redash

Metabase seems to have a GUI to construct SQL queries which re:dash doesn't. What else is different?

Western Digital agrees to buy SanDisk for about $19B bloomberg.com
69 points by jsnathan  3 hours ago   22 comments top 3
1
sschueller 1 hour ago 4 replies      
This sucks for consumers. Prices are already way too high for HDs and don't seem to come down anymore. It is as if there is some sort of price fixing going on.
2
trymas 1 hour ago 4 replies      
I always thought that companies were bought at the price of their market value (stock price x number of shares), or if someone did not want to pay all in cash, they would try to compensate in other ways until the market value was reached.

But here WD bought SD for ~$85-86 per share when it was worth ~$75 per share. That's at least 13% more.

Does it simply mean that WD expects SD to rise in value rapidly? Or did I misunderstand how company acquisitions are valued?
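For what it's worth, the premium the comment describes is simple arithmetic; a quick sketch using the comment's approximate per-share figures:

```python
def acquisition_premium(offer_price, market_price):
    # premium paid over the pre-deal market price, as a percentage
    return (offer_price - market_price) / market_price * 100

# approximate per-share figures from the comment: ~$85 offered vs ~$75 market
print(round(acquisition_premium(85.0, 75.0), 1))  # 13.3
```

Acquirers routinely pay such a premium over the trading price to persuade existing shareholders to tender their shares.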

3
jordancj 1 hour ago 0 replies      
These companies both sound like bad western tech crossover movies from the 80's. This was bound to happen.
Fedora opens up to bundling lwn.net
42 points by perlgeek  2 hours ago   20 comments top 6
1
davexunit 1 hour ago 6 replies      
This is an unfortunate development. It's tough to fight against the tide of developers that simply don't care or honestly believe it is a good thing, but it's a fight that is worth it for better security and easier maintenance of GNU/Linux systems, which directly benefits users. Thankfully, Debian hasn't given up, nor has GuixSD that I help maintain.

I asked Tom Callaway from Red Hat about it and he said "I'm not a fan, I think its a poor decision, but I also appreciate that I might be in the minority these days." [0]

Hopefully, once enough people have been burned by the apparent convenience of bundling, we'll see the tide change. Maybe after Dockerization has run its course.

[0] https://twitter.com/spotrh/status/656677002028691456

2
Skunkleton 30 minutes ago 0 replies      
I have worked on several large applications where libraries were customized and bundled in. We would have been better off in the long term implementing the small delta we needed from the library in our application. In every case I saw, it was just an example of lazy engineering that led us to bundling.
3
vezzy-fnord 1 hour ago 0 replies      
Policies that are out of line with reality are bad policy: the war on drugs does not fix drug abuse, vagrancy laws do not fix poverty, and the war on bundling merely ensures that bundled software goes unreported.

The metaphor doesn't pan out. The third is canonizing a technical error.

4
mgrennan 42 minutes ago 0 replies      
Why Docker? Why not RPMs? They are not that hard to build, and they have years of design and work behind them. I hate bloat. 20 years ago I could get the same work done as today in 1/1000th the memory and disk space.
5
SEJeff 1 hour ago 1 reply      
I think this makes it exceptionally hard for a distro with languages like Go, where vendoring libraries is the norm rather than the exception.

There were some serious issues with this on the golang-nuts Fedora ML some time ago, where lsm5 was lamenting the issues Fedora faces when upstream simply won't remove vendored libs.

6
matt_morgan 1 hour ago 3 replies      
I've used (servers and laptops and desktops) Fedora and Ubuntu for many years. Since the advent of package management, Fedora has been significantly more "just works" when it comes to anything slightly professional or complicated (of course Ubuntu has had the edge on personal multimedia), and I'll bet this practice of discouraging bundling is a huge part of that. On the other hand, Fedora usage is falling, proportionally, right? I'm not sure, but it seems like it. Anyway: tough call.
Scientists encouraging online piracy with a secret codeword bbc.co.uk
170 points by RobAley  7 hours ago   151 comments top 26
1
kragen 6 hours ago 6 replies      
The only "online piracy" I see here is when Elsevier demands US$30 from you to get a copy of a paper written by scientists paid for by your tax dollars, who paid Elsevier page fees to publish it. Elsevier and similar companies are the thieves here, and they have a hell of a lot of nerve to be accusing scientists of "stealing" and "piracy" for working to create the very knowledge Elsevier shamelessly exploits. (Even Elsevier's very name is a theft: they are attempting to free-ride on the goodwill of the Elzevir family of Renaissance publishers, who have no connection with them.)

Do they have the law on their side? Yeah. So did the Pope when he sentenced Galileo to life in prison for promoting heliocentrism. That doesn't mean they're in the right; that means the law is in the wrong.

2
fermigier 6 hours ago 1 reply      
In France, there is a proposal, backed by an overwhelming majority of scientists, to mandate free and open access to scientific research results.

See: https://www.republique-numerique.fr/consultations/projet-de-...

Hopefully this could end up in the Law next year.

3
xefer 4 hours ago 2 replies      
I've had over 100% luck emailing papers' authors directly asking for a copy of a particular paper I've been interested in reading. I typically get a PDF emailed back to me.

I say "over 100%" because several times I've had hard copies sent for whatever reason with hand-written letters thanking me for expressing interest in their research and letting me know they'd be happy to answer any questions, etc.

I've generally found that some researchers, especially in relatively arcane areas, are very pleased to find people who are genuinely interested in their work.

I only appeal to authors directly if I'm unable to access a paper online through my library's JSTOR access, which is fairly extensive.

4
amateurpolymath 8 minutes ago 0 replies      
Economists Ted Bergstrom and Preston McAfee (currently at Microsoft) have long studied journal pricing. Here is Ted's page on the matter: http://www.econ.ucsb.edu/%7Etedb/Journals/jpricing.html

His table of particularly overpriced journals in economics is dominated by Elsevier journals: http://www.econ.ucsb.edu/~tedb/Journals/roguejournals02.html

Hopefully we see more academics collectively abandoning such journals like Knuth and the Journal of Algorithms board and these other examples from Ted's website: http://www.econ.ucsb.edu/%7Etedb/Journals/alternatives.html

6
imrehg 6 hours ago 1 reply      
There's also /r/scholar[1], which does the same thing, and so far it's been working really well (for me, a physicist out of academia at the moment)

[1]: https://www.reddit.com/r/scholar

7
ckozlowski 5 hours ago 1 reply      
Apologies in advance, but when I saw this link, I expected to find an article about a nondescript phrase ("Blue Iguana" or some such) that would tip people off to meet in an unlisted IRC room or some such.

I realize not everyone is on top of internet culture and slang, but reading that "#icanhazpdf" is a "secret codeword" makes me wonder if the whole piece is tongue-in-cheek ("I am shocked, absolutely shocked to find gambling in here!") or if the author really has discovered the internet for the first time.

Just bemused.

8
ajuc 6 hours ago 3 replies      
Good for them.

Living in a developing country, you learn to ignore copyright or you never learn anything. I don't know if it was invented as a way for developed countries to keep a competitive advantage, but it sure would work that way if people actually obeyed.

9
Blahah 4 hours ago 1 reply      
#icanhazpdf, Sci-Hub, libgen, etc. are all symptoms of the disease. Science is in something like turmoil as it adjusts to the internet. Of course, the rest of the world has already adjusted to the internet - science hasn't because publishers have used their monopoly over our scientific knowledgebase to systematically prevent progress.

Some food for thought: science is mostly funded by public money. A small portion of that money goes to paying scientists - the rest goes on products and services bought in the process of research. Some of these are necessary. But publishing takes a large chunk of that funding stream - they charge us thousands of dollars to put articles we write on their website. In almost all cases they add no value at all. Then, they charge us, and anybody else, to read what we wrote.

But maybe it just costs that much? There are two issues here: firstly, for-profit academic publishers have some of the highest profit margins of any large business (35-40%). Secondly, they are charging thousands of dollars for something that with modern technology should be nearly free. They are technically incompetent to the extreme - not capable of running an internet company that really serves the needs of science or scientists.

They systematically take money that was intended to pay for science, and they do it by a mixture of exploiting their historical position as knowledge curators and abusing intellectual property law. They also work very hard to keep the system working how it is (why wouldn't they? $_$) - by political pressure, by exploitative relationships with educational institutions, by FUD, and by engineering the incentive structure of professional science by aggressively promoting 'glamour' and 'impact' publications as a measure of success.

The biggest publishers are holding science back, preventing progress to maximise their profit. We need to cut them out, and cut them down. Take back our knowledge and rebuild the incentives and mechanisms of science without them being involved.

10
dombili 3 hours ago 0 replies      
It's a shame that people whose job is to advance humanity have to spend their time dealing with crap like this.

I'm glad they've found a workaround but that being said, opening a PDF attachment coming from god knows where isn't the best idea. I hope they're being careful.

11
OlafLostViking 6 hours ago 3 replies      
I'm in the lucky position of having access to most publications legally. But I cannot imagine what to do if our library didn't have subscriptions. The prices most publishers are demanding are insanely high and simply not financeable if you need just a dozen papers or so.

Especially considering that the research and the writing are done by scientists, and the review is done by other scientists. For free. The writers even pay a lot of money to get published. So I wonder what justifies these price tags for offering a PDF for download.

Don't get me wrong - I can still see the role of a publisher in the scientific world. But perhaps the monetization model should be reworked... As the article said: let's see how this whole publishing world will change. Open Access and comparable models are becoming more and more popular.

12
mikegerwitz 2 hours ago 0 replies      
Fundamentally, we're talking about the dissemination of knowledge. Yes, it is copyright infringement, but calling this "piracy" immediately associates this act with both theft and brutal disregard for the law.[0] That is not what is happening here.

With that said---I'm a Nature subscriber, and I'm pleased to see the emphasis on "Open Access" by many scientists and organizations. Hopefully this trend will continue, and silly issues like individuals requesting PDFs from fellow scientists won't be termed "piracy".

[0]: https://www.gnu.org/philosophy/words-to-avoid.en.html#Piracy

13
baghira 3 hours ago 1 reply      
It puzzles me that the most significant problem with open access receives little mention in discussions on HN: it changes the incentive structure of publication, from one where the publisher has to please the people buying the journal to one where it has to please the people paying to submit articles.

This is what makes the situation profoundly more complex compared to other applications of copyright, say in the software industry, where clearly switching to an open source model doesn't change the incentives, i.e. who assesses the quality of the software.

The long-term effects on academia of switching to a model where the taxpayer gives money to scientists to pay for open access submission of their research are hard to evaluate, and do not get enough thought (imho).

That clearly doesn't mean that there aren't bad journals that are not OA, nor that for the benefit of the public some sort of arrangement shouldn't be found for older research: I'm a big believer in "faster decaying" copyright in general, and mandating that all publications describing research that is publicly funded become OA after, say, 30 years, would help significantly.

14
yati 54 minutes ago 0 replies      
I've never published a paper, and can't understand why we need actors like Elsevier and other paywalls for scientific research publication. What motivates scientists to use a publisher's services? Can't these be replicated by setting up a government publication house?
15
chrisBob 1 hour ago 0 replies      
I upload my papers through Researchgate. I know that it may not be legal to do so, but it is password protected, and hasn't been challenged by too many publishers. Sharing this way makes great sense for the author. You want people to read your paper, and it gives a way to do so. You must create an account, but many papers that would otherwise be blocked can be found this way.

The other trick I recommend people try if they frequently have trouble finding papers is to try EndNote. It is a little expensive, but I found it to be great at finding papers that I couldn't get through the official sources with my school's access.

16
alkonaut 5 hours ago 0 replies      
I don't see any problem with having Elsevier manage publications that prevent people from copying their content. Just as long as that content is also available elsewhere for free, if it's publicly funded research.

I assume the problem is that Elsevier doesn't much like when articles are also made available outside their publications? Well, then either starve them of all publicly funded content or just have them accept that all the publicly funded content will always be available outside their publications. It's as simple as that.

How hard would it be to pass a law requiring that publicly funded research be publicly available? Why aren't such proposals made? If they are, what has stopped this from already being law?

17
omginternets 6 hours ago 0 replies      
I'd love to see "popcorn time for scientific publications". Hint, hint.
18
robotkilla 3 hours ago 1 reply      
> The original tweet is deleted, so there's no public record of the paper changing hands.

Why is it assumed that there is no public record of the paper changing hands? They tweet the request publicly, so it stands to reason that someone is paying attention and archiving. I suppose the key word here is "public", but I'm not sure why that matters if the goal is covering up illegal activity.

19
baldfat 4 hours ago 1 reply      
Peer Review = Flawed http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1420798/

Tax Payer Money going to research not available to continue science = Flawed Policy

How can an article about this not mention Aaron Swartz?

20
RUG3Y 1 hour ago 0 replies      
This is the age of the internet. We have no need for these publishing middlemen. Knowledge like this should be shared freely for the public good.
21
bluerobotcat 2 hours ago 0 replies      
I was expecting the secret codeword to be 'preprint'. When I was in academia not too long ago, I would often ask authors for the preprint of this or that paper, and they'd usually send it back promptly.
22
Kristine1975 6 hours ago 0 replies      
The hashtag seems to have originated in 2012: http://www.altmetric.com/blog/interactions-the-numbers-behin...
23
teekert 4 hours ago 4 replies      
I also don't like it, but the paper needs to be printed and reviewed. This is not free. Perhaps we should agree that the publishing group pays the entire cost of the article so that it can be free after the process of publishing? Or boycott paywalled publishers, maybe go for PLOS? If you have ever complained about paywalls, don't ever publish in a paywalled journal yourself.

I'm all for free papers by the way, nothing is more annoying than researching things and hitting paywalls, but someone has got to pay the people doing the publishing work.

Also: If I order a paper from our library or I download it myself, it often comes with an on-the-fly generated cover page with my IP address on it. One can remove that, certainly, but there may be other mechanisms to tag papers. Amazon reportedly investigated (and implemented?) putting specific, unique errors in DRM-free ebook copies to identify sources of piracy. So I wouldn't advise you to just send the PDF around, unless maybe you are the author and have a PDF that did not go through the publishing process.

Still loving the initiative though ;)

24
schoen 5 hours ago 0 replies      
It's funny to see something of the outside view. I suspect people reading this on HN are much more likely to understand "I can haz", as well as easily relate to the scientists' point of view.
25
c3534l 1 hour ago 0 replies      
Aww. Scientists are adorable.
26
raverbashing 6 hours ago 0 replies      
Good!

Even better, publish your articles 'for free'

Overcomplexifying, Underdelivering ieee.org
19 points by imdsm  6 hours ago   2 comments top 2
1
fnordsensei 3 minutes ago 0 replies      
Overcomplecting*

Some make serious money out of scope creep in IT projects. Sometimes I think that this must be how the major consultancies are sustaining themselves.

Someone wrote that the /average/ IT project ran 10x over budget. This would be completely unacceptable in any domain but IT.

2
Roboprog 16 minutes ago 0 replies      
How does one hold the requirements for "700 systems" in one's head?

The only way that could possibly work is if the original systems are grossly repetitive and unoriginal (variations on 2 themes, perhaps?).

The Right Thing? dadgum.com
41 points by Scramblejams  2 hours ago   9 comments top 7
1
jasode 1 hour ago 1 reply      
>Also the code is faster, likely because it doesn't have to load another module at runtime.

Sure, but common sense would guide programmers about such tradeoffs. The extra time spent loading an additional library dependency would be amortized over the total execution time of the program -- IF -- the program makes repeated use of the library.

If a js file only has a single getElementById() call, substituting that with the jQuery library to get the "$" syntax would be overkill.[1] However, if the javascript has many complicated DOM selects and a bunch of animations, the extra jQuery load time can be justified.

One of the reasons to use a library the author didn't touch on was "insurance against unknown edge cases." For example, I could attempt to write 50 lines of code to uppercase a lowercase Unicode string. However, my attempt would have bugs in it. Instead, it would be more prudent to use the ICU library. It's a hassle to add that dependency and it's many thousands of lines more than my "simple" program but the ICU developers covered more edge cases than I ever thought of.

[1]http://www.doxdesk.com/img/updates/20091116-so-large.gif
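jasode's point about edge cases is easy to demonstrate. Here is a small Python sketch (using the built-in str.upper() as a stand-in for a full case-mapping library like ICU): a hand-rolled ASCII uppercase works fine on plain English but silently misses special case mappings such as the German ß and typographic ligatures.

```python
def naive_upper(s: str) -> str:
    # An ASCII-only uppercase, the kind one might hand-roll in a few lines.
    return "".join(chr(ord(c) - 32) if "a" <= c <= "z" else c for c in s)

# str.upper() applies the full Unicode case mappings the naive version misses.
print(naive_upper("straße"), "straße".upper())  # STRAßE STRASSE
print(naive_upper("ﬁle"), "ﬁle".upper())        # ﬁLE FILE
```

The naive version leaves ß and the fi ligature untouched, while the full case mapping expands them to SS and FI.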

2
bluetomcat 1 hour ago 0 replies      
This reminds me of the famous quote by Joe Armstrong -- You wanted a banana but what you got was a gorilla holding the banana and the entire jungle.

Most modern software has incredibly complex dependency chains and that's what makes it fragile and unpredictable most of the time. If we focus on making the languages, runtimes and core libraries flexible enough that we don't need to assemble code from dozens of hobbyist GitHub projects to put up an app with a reasonably modern UI, we would make a huge step forward.

3
zaphar 1 hour ago 0 replies      
Many times the "right" thing is a balance between using a library/module as a black box and implementing something yourself. It usually involves a lot of considerations.

* License: is it compatible with the rest of your code?

* Relative size of the module vs. what you need from it: the size of the module introduces a non-trivial maintenance burden.

* Difficulty of just doing it yourself: is the thing you need to do non-trivial, like say encryption, or distributed consensus?

* Compatibility: is the module compatible with the internals of your system? Will it require significant changes to how your code or application works?

* Platform support: will the module work on all the platforms your code/application will run on?

All of these things influence the decision. A knee-jerk response of "Just use X" may be right, but you'll find yourself in the position of not knowing why it's right and thus unable to adjust if it ever stops being right.

4
strommen 47 minutes ago 0 replies      
It's frustrating when a library function does a worse job than 15 lines of code you wrote yourself.

It's doubly frustrating when a library function is obsoleted and replaced wholesale rather than fixed of whatever defects it has.

And file I/O is triply frustrating because there are all sorts of corner cases, plus leaked abstractions from the file system and hardware.

The moral of the story is that you should never use libraries, and never do any reading or writing of files. Wait, what?

5
Eclyps 1 hour ago 1 reply      
I often find myself feeling the same way about using gems in ruby. It's amazing how quickly I can get something extremely functional built out by adding a handful of gems, but after a few months I look back and realize how much extra (invisible) code I've added when I really just needed a tiny sliver of the functionality provided.
6
nickpsecurity 34 minutes ago 0 replies      
Richard Gabriel....

https://www.jwz.org/doc/worse-is-better.html

...would've said that module embodied the Worse is Better philosophy: something that snowballed into a mess with uptake and feature creep. Just guessing: I haven't read its source or anything.

Conciseness and correctness often go hand in hand. Your approach is likely closer to The Right Thing. ;)

Top curling teams say they won't use high-tech brooms cbc.ca
27 points by mhb  2 hours ago   16 comments top 5
1
jccalhoun 1 hour ago 1 reply      
There are a couple things going on with this story:

One is the role of technology in sports which is really interesting. In other sports there is a lot of debate over what technology is allowed and what isn't. I would be really interested to see some things like how far someone could hit a golf ball if there were no restrictions on the club or ball or how fast a person could swim if there were no restrictions on the swimsuits.

The second thing, however, is that the text of the story and the video have different focuses. The text focuses on telling the story of how this is a grassroots movement by some athletes. The video, however, seems to have a more pronounced undercurrent that this might really be about one company, BalancePlus, trying to put pressure against an upstart competitor, icePad, who is eating into their market share. I think it is really interesting that the text doesn't emphasize this as much as the video does.

2
brudgers 1 hour ago 0 replies      
This seems a bit like wooden tennis rackets a few decades ago, though with a less mainstream sport the rules body may have more pull than manufacturers. But the bottom line is that sports change over time or die. Changes simply mean that a different set of skills come to the fore and others become less valuable.
3
pnut 2 hours ago 1 reply      
If it's allowed in the regulations, and they want to win, they will use it, end of story.
4
amelius 2 hours ago 2 replies      
> We don't want the teams with the best technology and whoever sponsors who to win

I applaud this, but I guess ultimately, either directly or indirectly, the sponsors still decide who is on a team. Or in other words, teams with more money will still be better teams.

5
rpledge 2 hours ago 1 reply      
So they're going to go back to corn husk brooms? Where do they draw the line?
Microsoft Surface Pro 4 Review anandtech.com
94 points by fumar  2 hours ago   43 comments top 13
1
lewisl9029 8 minutes ago 0 replies      
Personally, I actually prefer the 16:9 aspect ratio for any screen that has sufficient raw vertical resolution to not hinder productivity.

I feel that once a device gets past the point where the lack of vertical resolution limits productivity, the marginal utility offered by even more vertical resolution becomes rather insignificant, to the point where I'd probably benefit more from having more horizontal resolution for things like snapping windows side-by-side and not having black bars on 16:9 video.

2
fumar 1 hour ago 2 replies      
I have a Surface Pro 3 and owned a Pro 1 (1st gen), and I have to applaud Microsoft for iterating quickly on their hardware line. It's been ~3 years and (in my personal experience) general consumer approval of the Surface line has been gradually changing. It went from an iPad competitor, which was a terrible comparison, to generating its own category: a tablet that can replace your laptop.

I typically skip a generation when upgrading machines, but the Pro 4 (based on this review) solves all the small quibbles I had with mine. It's looking like I am going to upgrade to the Surface Pro with Iris graphics. Still on the fence about the Book; I don't really need the laptop-ness.

3
bkjelden 1 hour ago 2 replies      
To me, it's almost as much of a reflection on Microsoft's OEMs as it is on Microsoft that, in 3 years, Microsoft has iterated from its first device, which was a commercial flop, to a tablet/ultrabook whose only real criticism is the lack of a USB Type-C port.

Where would Microsoft be today if they had given up on their OEMs 5 years sooner, and gone head to head with Apple on hardware?

4
guelo 1 hour ago 1 reply      
This thing looks nice but I think the Surface Book is going to be the breakout device for Microsoft in this generation.
5
nogridbag 1 hour ago 1 reply      
They also posted a Surface Book "First Look" today with a couple of GPU benchmarks:

http://www.anandtech.com/show/9732/the-microsoft-surface-boo...

6
smrtinsert 1 hour ago 1 reply      
I can't stand the material of the soft plastic keyboard case. It drives me batty.
7
kozukumi 2 hours ago 0 replies      
Hmm the NAND write is a little low. Not a huge deal but a little surprising for a machine of this spec/price.
8
swozey 1 hour ago 3 replies      
The Yoga 900 and this should be an interesting competition. I think I'll definitely grab one of them this year.
9
dman 2 hours ago 0 replies      
Does anyone have information about battery life of the Core m3 vs. i5 vs. i7 in the SP4?
10
Artistry121 1 hour ago 3 replies      
How does this compare against the Macbook?
11
JTon 2 hours ago 1 reply      
Was desperately looking for a more direct comparison between the m3 and i5. I can't decide!
12
jbssm 26 minutes ago 1 reply      
Strangely my workflow is in a place where most of my graphic programs also exist on Windows (Lightroom mostly, everything else is easily changeable).

It's my terminal workflow that I can only use on a Mac/Linux: Zsh, Vim, Tmux, LaTeX, Python, and mostly a package manager (Apt on Linux, Homebrew on the Mac). Most of this I can configure on a new system by pulling the config files from my GitHub repository in less than 15 minutes after a fresh install.

That's what's mainly keeping me from going back to Windows (that, and the application update process, which is still awful, as I can see from my VMware installation of Windows 10... seriously, updating the various components of Visual Studio in a semi-manual way is just ridiculous). Is there any good alternative for the shell on Windows that doesn't involve considerable tinkering?

13
skrowl 1 hour ago 1 reply      
Unfortunately, his review sample only had the weak Intel integrated GPU. I can't wait to see some numbers from the models with the nVidia GPU.
Hilbert's paradox of the Grand Hotel wikipedia.org
11 points by vinnyglennon  4 hours ago   2 comments top 2
1
mfoy_ 8 minutes ago 0 replies      
>"If an infinite set can be put into one-to-one correspondence with the natural numbers (N) it is called a countable set. Otherwise it is uncountable."[1]

This paradox hinges on the strange notion of cardinality of infinite sets. Specifically, the set of all even integers, the set of all odd integers, and the set of all integers(!) have the same cardinality, and therefore the same "size".

---

[1]http://www.math.ups.edu/~bryans/Current/Journal_Spring_1999/...
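The room-shuffling trick behind the paradox is just such a one-to-one correspondence. A minimal Python illustration (my own sketch, not from the article): a full hotel makes room for infinitely many new guests by moving the guest in room n to room 2n, which is a bijection from all rooms onto the even rooms.

```python
def new_room(n: int) -> int:
    # Hilbert's trick: the guest in room n moves to room 2n,
    # freeing every odd-numbered room for new arrivals.
    return 2 * n

# Distinct guests land in distinct (even) rooms:
print([new_room(n) for n in range(1, 11)])  # [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
```

No two guests collide, and the infinitely many odd rooms are all vacated at once, which is exactly why the evens have the same cardinality as all the naturals.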

2
GhotiFish 7 minutes ago 0 replies      
I like this paradox for its simplicity, but there's just one aspect that cracked me up.

>Suppose the hotel is next to an ocean, and an infinite number of aircraft carriers arrive, each bearing an infinite number of coaches, each with an infinite number of passengers.

hahaha.

How would we extend that?

suppose we have an infinite number of passengers, carried by an infinite number of coaches, transported by an infinite number of aircraft carriers, shoved in by an infinite number of tsunamis, which occur on an infinite number of continents, on an infinite number of Dyson spheres...
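The tower of infinities still fits in the hotel because finitely many nested countable infinities remain countable. One standard construction (my own sketch, using the Cantor pairing function; other schemes such as prime powers work too) applies a pairing function once per layer of nesting, so every (carrier, coach, seat) triple gets its own room.

```python
def pair(a: int, b: int) -> int:
    # Cantor pairing: a bijection from pairs of naturals to naturals.
    return (a + b) * (a + b + 1) // 2 + b

def room_for(carrier: int, coach: int, seat: int) -> int:
    # Iterate the pairing to handle a triple; deeper nesting
    # (tsunamis, continents, Dyson spheres...) just adds iterations.
    return pair(pair(carrier, coach), seat)

# Distinct triples land in distinct rooms:
rooms = {room_for(c, k, s) for c in range(10) for k in range(10) for s in range(10)}
print(len(rooms))  # 1000
```

Since pair() is injective, so is room_for(), and each extra layer of infinity costs only one more application of pair().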

The Scientific Basis of Cryonics technologyreview.com
12 points by apsec112  3 hours ago   3 comments top 2
1
cryoshon 18 minutes ago 1 reply      
Yeah, I freeze and thaw cells constantly without much trouble--just plop those suckers in a solution of 10% DMSO and a serum-containing isotonic liquid then throw into the freeze-assist device, then put that into the -80C freezer and transfer it to the liquid nitrogen freezer the next morning. Everyone does it this way, and has for years. I've always wondered if the freezing and thawing process subtly changes cell functioning (beyond the time given to cells to rest after being thawed), but I'm guessing if it were the case someone would have noticed it by now.

The bugger that haunts human cryonics is that thawing is never perfect because the cryoprotectants used to prevent ice-crystals within the cells are usually toxic. If you freeze cells that are measured to be 100% viable/alive at the time (very common) then thaw them using best practices, you're going to have some cell death-- maybe 1-5% if you're fast (less time spent in toxic cryoprotectant) and lucky. If you're unlucky or slow, you can look at 25-45% of your originally healthy cells being dead upon completion of the thawing process. The remaining cells are usually extremely discombobulated, and can take days to return to their baseline. This is completely fine if you're tooling around in a research lab or industrial lab, but even a 1% loss is probably too much for a human brain to bear and remain the same as before.

I suppose that if you work under the assumption that the future technology cryonics relies on for thawing will exist, cell loss during thaw will not be a problem; I find this possibility to be fairly likely over a long time span. Alternatively, you could assume that there will be advanced ways of restoring brain function or generating fresh neurons after systemic damage-- quite a stretch if you ask me, but it's conceivable. I think that ultimately the goals of cryonics will be scientifically realizable for those who were most recently preserved.

2
reasonattlm 21 minutes ago 0 replies      
See also the scientists' open letter on cryonics, which is old enough to have had many homes online over the years, but is now here:

http://www.evidencebasedcryonics.org/scientists-open-letter-...

You might also look at the Alcor FAQ for scientists:

http://www.alcor.org/sciencefaq.htm

Some of the ongoing information generated by the Brain Preservation Foundation's technology prize competition is also interesting.

http://www.brainpreservation.org/competitors/

The perspective of the BPF folk is perhaps a useful calibration point for those coming into this as a new topic; they are critical of cryonics for some detailed technical reasons, with plenty of room for debate, think that plastination should be developed as an alternative technology, but are firm supporters of the concepts of brain preservation and the evidence to date for fine structure preservation. For example, see this response to an earlier and very shoddy article critiquing cryonics at the Technology Review:

http://www.brainpreservation.org/ken-hayworths-personal-resp...

Requestdiff Send two HTTP requests and visualize any differences requestdiff.com
121 points by victordg  7 hours ago   31 comments top 16
1
jalfresi 6 hours ago 2 replies      
For command-line junkies:

diff <(curl -sS -L https://httpbin.org/get) <(curl -sS -L https://httpbin.org/get?show_env=1)

2
pixelbeat 23 minutes ago 0 replies      
Personally I use these from the command line:

http://www.pixelbeat.org/scripts/urldiff

http://www.pixelbeat.org/scripts/idiff

See also mergely, which supports diffing URLs: http://www.pixelbeat.org/programming/diffs/#mergely

3
therein 6 hours ago 2 replies      
Good work but frankly this would be useful if it was:

(1) Terminal based

(2) Supported other types of HTTP requests

(3) Supported request body

(4) Allowed editing request headers

(5) Wasn't so easily exploitable to be used as a proxy or a DDoS relay (server-side bummer):

http://requestdiff.com/proxy?url1=https://www.amazon.com&url...

4
diggan 6 hours ago 1 reply      
This is very cool! Looking forward to it being open source (also, things from Barcelona rocks ;) )

For people wanting to have a CLI tool instead, John Graham-Cumming's httpdiff might be worth looking at. https://github.com/jgrahamc/httpdiff

5
smarx 7 hours ago 1 reply      
Seems like a useful tool; thanks for sharing it!

Two questions:

1) What's the diff logic? At first glance, it looks like JSON is reformatted (maybe canonicalized in some way) and then a line-by-line diff is applied. Is there more to it? Since the tool seems JSON-aware, I was surprised to see an added trailing comma show up as a difference.

2) Do you have plans to expand the kind of HTTP requests users can make? It would be nice to use different verbs, headers, and request bodies. Runscope has a similar tool[0] built in that I believe (haven't tried it yet) allows a bit more flexibility, but it would be nice to have a standalone tool available.

[0] http://blog.runscope.com/posts/comparing-http-requests-using...
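The reformat-then-line-diff approach smarx guesses at can be sketched in a few lines of Python. This is a hypothetical reconstruction, not the site's actual code; the function name and labels are invented.

```python
import difflib
import json

def diff_json_bodies(body_a: str, body_b: str) -> str:
    """Canonicalize two JSON response bodies (sorted keys, fixed indent),
    then apply a plain line-by-line unified diff."""
    def canon(s: str) -> list:
        return json.dumps(json.loads(s), sort_keys=True, indent=2).splitlines(keepends=True)
    return "".join(difflib.unified_diff(canon(body_a), canon(body_b), "url1", "url2"))

# Key order differences vanish after canonicalization; only the real change shows.
print(diff_json_bodies('{"b": 1, "a": 1}', '{"a": 2, "b": 1}'))
```

Note that a purely line-based diff like this can still flag artifacts such as a trailing comma on the line before an added element, which would explain the behavior smarx observed.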

6
krakensden 2 hours ago 0 replies      
Another solution: https://github.com/xthexder/httptee

A daemon that sends to two backends, and diffs the results.

7
SimeVidas 16 minutes ago 0 replies      
Use cases?
8
no_gravity 4 hours ago 0 replies      
Comparing URLs comes up from time to time here on Hacker News and at one point inspired me to write a tool that batch compares pages visually:

http://www.productchart.com/blog/2015-07-19-urldiff

It renders a set of pages in a headless browser, compares them visually and alerts you if something changed.

Just a few lines of bash as you can see. But it turned out to be pretty useful. UrlDiff is a regular part of our regression testing at Product Chart now.

9
kozhevnikov 6 hours ago 0 replies      
Reminds me of Diffy, a tool Twitter is using to algorithmically diff test their HTTP-based services.

https://blog.twitter.com/2015/diffy-testing-services-without...

10
dannybtran 4 hours ago 0 replies      
Not working on Safari Version 8.0.6 (10600.6.3) Mac OS X 10.10.3

From JS Console:

[Error] TypeError: undefined is not a function (evaluating 'Array.from(e)')
_toConsumableArray2 (app.min.js.pagespeed.ce.ozGaCBt6Kj.js, line 1)
s (app.min.js.pagespeed.ce.ozGaCBt6Kj.js, line 1)
f (app.min.js.pagespeed.ce.ozGaCBt6Kj.js, line 1)
onload (app.min.js.pagespeed.ce.ozGaCBt6Kj.js, line 1)

11
skastel 3 hours ago 0 replies      
If you're looking for something similar, but with more features that natively understand HTTP, check out www.runscope.com. Full disclosure, I'm an employee and proud of what we've built!
12
andybak 5 hours ago 0 replies      
Wonderful.

Just the other day I needed something similar and was disappointed that I couldn't find it.

I wanted to discuss something with a remote colleague and to illustrate it I wanted a visual diff of two files. I was hoping there was a nice little web app offering this but I was forced to screenshare (I could have terminal-shared but it was more hassle).

I was hoping for something like Etherpad but with a live visual diff.

13
vortico 5 hours ago 0 replies      
It's pretty easy to write a recursive diff function that compares JSON strings, in order to avoid the JSON -> diff by line hack that you're doing. But it's a clever hack that easily translates to the command-line.
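The recursive version is indeed short. A sketch of one (my own illustration, returning a path for every point of disagreement):

```javascript
// Recursively diff two parsed JSON values, returning a list of
// { path, left, right } records for every mismatch.
function recursiveDiff(a, b, path = "$") {
  if (a === b) return [];
  const bothObjects =
    a !== null && b !== null &&
    typeof a === "object" && typeof b === "object" &&
    Array.isArray(a) === Array.isArray(b);
  // Primitives (or object vs. array) that differ: report and stop.
  if (!bothObjects) return [{ path, left: a, right: b }];
  const keys = new Set([...Object.keys(a), ...Object.keys(b)]);
  const diffs = [];
  for (const key of keys) {
    const childPath = `${path}.${key}`;
    if (!(key in a)) diffs.push({ path: childPath, left: undefined, right: b[key] });
    else if (!(key in b)) diffs.push({ path: childPath, left: a[key], right: undefined });
    else diffs.push(...recursiveDiff(a[key], b[key], childPath));
  }
  return diffs;
}
```

Unlike a line diff, this reports semantic differences only, so formatting artifacts like trailing commas never appear.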
14
zwischenzug 6 hours ago 1 reply      
This would be useful behind a firewall. Is the source available?
15
amelius 5 hours ago 0 replies      
I'd like to have a bookmarklet that lets me compare the states of the DOM trees of a website taken at two different times.
16
blemasle 5 hours ago 2 replies      
Requests are made server side, which makes this unusable for local/dev environments. Too bad!
Cell (microprocessor) wikipedia.org
13 points by dmmalam  5 hours ago   1 comment top
1
protomyth 1 minute ago 0 replies      
"The Race for a New Game Machine"[1] is a pretty good read on the development of the Cell and Xenon[2] processors. Giving up the out-of-order seems like a really bad decision and the internal fighting at IBM, if accurate, is really sad.

A side question: What in the POWER architecture makes it hard to implement? I was told the addressing modes are complicated enough that it will always be slower and harder to create than other processors. I'm wondering if this is urban myth or has some basis in reality?

1) http://www.amazon.com/The-Race-New-Game-Machine/dp/080653101... with a lot of articles written about the book such as http://www.wsj.com/articles/SB123069467545545011

2) https://en.wikipedia.org/wiki/Xenon_(processor)

Show HN: WebRPC a simple alternative to REST and SOAP github.com
12 points by gk_brown  2 hours ago   5 comments top 3
1
captn3m0 48 minutes ago 1 reply      
While this may be simpler than REST and cleaner than SOAP, I find REST to be far more elegant.

>An HTTP 200 is returned on successful completion, and HTTP 500 is returned in the case of an error (i.e. an exception). Note that exceptions are intended to represent unexpected failures, not application-specific errors. No other HTTP status codes are supported.

Now that you're just using HTTP as a transport layer, you could very well customise it to your particular needs rather than defining a spec. This might result in easier client code, but why not use Thrift if that is what you need?

2
leejoramo 47 minutes ago 1 reply      
So how does this compare to XML-RPC?

I see that it is using JSON instead of XML.

> Support currently exists for implementing web RPC services in Java, and consuming services in Java, Objective-C/Swift, or JavaScript.

While I may personally feel that JSON is a better format than XML, there are implementations of XML-RPC for almost all languages and platforms, which is a huge advantage.

3
iambvk 40 minutes ago 0 replies      
Does it open a socket for every rpc?
A Former Twitch Employee Has One of the Most Reproduced Faces fivethirtyeight.com
7 points by JacobAldridge  3 hours ago   discuss
Jepsen: Distributed Systems Safety Analysis jepsen.io
265 points by luu  12 hours ago   55 comments top 11
1
willchen 11 hours ago 1 reply      
I'd be very interested to see RethinkDB analyzed, particularly with the 2.1 release promising high availability through Raft. RethinkDB and Aphyr have talked about doing Jepsen tests for it, but I'm not sure where that's landed (https://github.com/rethinkdb/rethinkdb/issues/1493).
2
OMGWTF 10 hours ago 2 replies      
> Flash plugin missing

> Get the latest Flash player to view this content

No, I won't. This scares me too much:https://www.cvedetails.com/vulnerability-list/vendor_id-53/p...

3
striking 11 hours ago 2 replies      
A lot of people are noting their interest in seeing a certain database tested. Sorry to say so, but:

>> Can you test X next?

> Tests take about a month. I do take suggestions into consideration, but I can't promise you anything. Backlog is a few years long at this point.

(from https://aphyr.com/about)

4
felixgallo 32 minutes ago 0 replies      
On a human note: Kyle is brave as fuck for stepping out into the black unknown yawning abyss to try to do this under his own banner. Bravo, Kyle, and may you find incredible success.
5
JoelJacobson 11 hours ago 3 replies      
Why isn't PostgreSQL in the list on the first page? It is in the [blog posts] page though.

I would love to see the multixact data corruption problems introduced in 9.3 analyzed, and see if he can verify them to be solved in the latest version.

6
headcanon 11 hours ago 2 replies      
I wonder if Carly Rae Jepsen is aware of the legacy of her song in the software world...
7
eddd 5 hours ago 1 reply      
I really dig the Riak test. One of the best comparisons of LWW and CRDT. Also a very good conclusion; tl;dr: if you don't use idempotent writes, you're going to have a bad time.
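The idempotence point in miniature (a made-up register, not code from the Riak post): when a network failure forces a retry, a repeated increment double-counts, while a repeated set converges:

```javascript
// A register applying a stream of writes, where retries after an
// ambiguous failure may deliver the same write twice.
function applyWrites(initial, writes) {
  let value = initial;
  for (const w of writes) {
    if (w.op === "incr") value += w.amount; // NOT idempotent
    else if (w.op === "set") value = w.value; // idempotent
  }
  return value;
}

const once = [{ op: "incr", amount: 5 }];
const retried = [{ op: "incr", amount: 5 }, { op: "incr", amount: 5 }];
// A retry changes the result for incr...
console.log(applyWrites(0, once), applyWrites(0, retried)); // 5 10
// ...but not for a set delivered twice.
console.log(applyWrites(0, [{ op: "set", value: 5 }, { op: "set", value: 5 }])); // 5
```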
8
jhasse 11 hours ago 1 reply      
Please make the title more verbose.
9
doomrobo 9 hours ago 2 replies      
Was Jepsen developed at Stripe? If so, how can aphyr get the rights to use Jepsen for individual contracting work?
10
MCRed 11 hours ago 1 reply      
I really would like to see Couchbase analyzed... and for aphyr to marry me.
11
votemedown 11 hours ago 0 replies      
The only thing that passes Jepsen: vim.
Khan Academy's React style guide github.com
92 points by mdp  8 hours ago   48 comments top 13
1
unoti 4 minutes ago 1 reply      
I've been working on learning React, and finding it particularly difficult. It appears there are a large number of things I need to have in place and pieces of knowledge I need before I can use it. These include some kind of JS compilation step like Webpack or Browserify, something like Babel, a knowledge of how to use ES6, an understanding of React, and an understanding of how to use React-Router.

Although I've done some Javascript on the front end, I haven't done the other things I mentioned. The tutorials all seem to assume I know how to do everything but one little piece of detail, and I'm finding it difficult to bite on the elephant. It's hard to tell where to start on learning this stuff, and how much I need to learn before I can use it.

Any suggestions for what resources and approach to use to learn react? My goal eventually is an app that runs in 3 versions: web, iOS, android. I don't intend to use javascript on the server.

2
andreasklinger 1 hour ago 4 replies      
Request for comment:

Linters are more than powerful enough by now.

Can we (as community) switch to documenting style guides as linter rulesets + custom linters. Eg for javascript: eslint and jscs

Written style guides are good for explaining why, but linters actually help others adopt a style more quickly

3
liquidise 44 minutes ago 0 replies      
A number of these guidelines reinforce my biggest complaint with React: it is architecturally difficult to avoid monolithic view files.

In a traditional web app, we have 4 layers: client views, client app, server app, database. React, described as a strict view layer, in reality is being used as much more. At this point, it is not just consuming the client app, but is also taking nibbles at the server app as well.

To each their own of course, but i would ask people to hesitate about these decisions. The architectural issues with monolithic views is well known, and just because we have a shiny new tool does not mean we should throw that understanding by the wayside.

Source: i work full-time on a React and Backbone app

4
iamjs 1 hour ago 0 replies      
It's probably a good time to start looking at using the Fetch API [1] for making AJAX requests instead of using jQuery or Backbone (or even XMLHttpRequest). Support seems to be growing quickly and Github's polyfill [2] can help cover the gaps.

[1] https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API[2] https://github.com/github/fetch
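For comparison, the shape of a fetch-based request next to the jQuery one. Note that fetch() only rejects on network failure, so HTTP-level errors (4xx/5xx) must be checked by hand via res.ok. Here "/api/user" is a made-up endpoint, and the fetch implementation is injectable so the sketch can run anywhere:

```javascript
// jQuery style:
//   $.getJSON("/api/user", function (user) { ... });
//
// Fetch style (promise-based):
function getUser(fetchImpl = fetch) {
  return fetchImpl("/api/user", { credentials: "same-origin" }).then((res) => {
    if (!res.ok) throw new Error("HTTP " + res.status);
    return res.json();
  });
}
```

The explicit res.ok check is the main behavioral difference people trip over when migrating from $.ajax, which invokes its error callback on HTTP errors.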

5
pramodliv1 3 hours ago 4 replies      
A couple of questions:

>Do not use Backbone models

I use Backbone models for ajax since it makes decisions such as PUT vs POST and model.save() looks cleaner than $.ajax. Also, Backbone collections provide a declarative way to handle sorting and duplicate models. But these models are internal to the Store and not exposed to the views. I'm still a React newbie. Is this a valid reason to continue using Backbone?
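The PUT-vs-POST decision mentioned above is small enough to replicate without Backbone, if that's the main thing keeping it around. A sketch of the convention (my own illustration of Backbone's behavior, not its source):

```javascript
// Minimal version of Backbone's save() verb choice: a model with no
// server-assigned id has never been persisted, so it's POSTed to the
// collection URL; an existing model is PUT to its own URL.
function planSave(model, urlRoot) {
  const isNew = model.id == null;
  return {
    method: isNew ? "POST" : "PUT",
    url: isNew ? urlRoot : `${urlRoot}/${model.id}`,
    body: JSON.stringify(model),
  };
}

console.log(planSave({ name: "espresso" }, "/items"));
// { method: 'POST', url: '/items', ... }
console.log(planSave({ id: 7, name: "espresso" }, "/items"));
// { method: 'PUT', url: '/items/7', ... }
```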

2. It seems as though Khan Academy do not use React for SVG elements in their interactive exercises. For example, https://www.khanacademy.org/math/geometry/transformations/hs... Do you plan to migrate SVG to React?

6
aaronkrolik 1 hour ago 0 replies      
Style question re DOM manipulation: third-party embeds such as Twitter and Instagram often come as specially classed blockquote elements that are swapped for iframes by a jQuery plugin. What is the best way to integrate this with React?
7
traviswingo 53 minutes ago 0 replies      
I find it funny seeing this section:

https://github.com/Khan/style-guides/blob/master/style/react...

Even though John Resig is one of their main devs.

8
tzury 4 hours ago 3 replies      
John Resig (jQuery creator) is one of the chief architects at there, and yet you see, this line:

 Never use jQuery for DOM manipulation
Nice.
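The rule coexists with having jQuery's creator on staff because in React the DOM is a function of state: you change state and re-render rather than mutating nodes. The idea in miniature, modeled with a plain string standing in for the DOM (hypothetical names, no React required):

```javascript
// jQuery style mutates the document directly:
//   $("#count").text("Clicked " + n + " times");
//
// React style recomputes the view from state and lets the library
// reconcile the difference against the real DOM.
function render(state) {
  return `<button id="count">Clicked ${state.clicks} times</button>`;
}

let state = { clicks: 0 };
let dom = render(state);

function setState(patch) {
  state = { ...state, ...patch };
  dom = render(state); // the view is always derived, never hand-edited
}

setState({ clicks: state.clicks + 1 });
console.log(dom); // <button id="count">Clicked 1 times</button>
```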

9
NoCulturalFit 2 hours ago 2 replies      
I would love to read more about styling inline and completely remove external CSS files.

Are there any CSS frameworks that have been converted to JS but are not their own components yet? It's easy to find React-Bootstrap, but that comes with ready-made components; I am looking for styling that's purely in JS so I can make my own components.

Also would a route-component be considered logic or presentation, or maybe it is its own thing and they forgot to mention it?

10
twsted 3 hours ago 2 replies      
"Do not use Backbone models."

This seems a little strong. What is the reason for this guideline? I know of many projects that are combining the use of React with Backbone.

11
codingdave 3 hours ago 1 reply      
"Fit them all on the same line if you can." (HTML Properties) -- OK, easy enough, my editor has no limit on its line length. I can always fit them on one line. Or did they expect it to fit on one line visually. If so, at what width to they expect my editor window to be?

In all seriousness, though, I appreciate the brevity of this guide. It can be quickly read and understood, and is not the fully-fledged book I've seen from other places.

12
amelius 4 hours ago 4 replies      
It makes me feel so sad that the "state of the art" in front-end development is apparently rerunning your render code on every little update. Yes, I know there are shortcuts to making this more efficient, but in essence the technique remains inelegant in the sense that it does not extend well to other parts of the code (that runs computations that might also partially, and not fully, change on an update).
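The "shortcuts" alluded to here are mostly memoization: skip recomputing a subtree whose inputs haven't changed, which is what React's shouldComponentUpdate short-circuit does. A minimal sketch of the idea (my own illustration, no React involved):

```javascript
// Wrap a render function so it only recomputes when its props change
// under a shallow comparison, mirroring shouldComponentUpdate.
function memoRender(renderFn) {
  let lastProps = null;
  let lastResult = null;
  let computations = 0;
  function shallowEqual(a, b) {
    if (a === b) return true;
    if (!a || !b) return false;
    const ka = Object.keys(a);
    const kb = Object.keys(b);
    return ka.length === kb.length && ka.every((k) => a[k] === b[k]);
  }
  const wrapped = (props) => {
    if (shallowEqual(props, lastProps)) return lastResult; // skip re-render
    computations++;
    lastProps = props;
    lastResult = renderFn(props);
    return lastResult;
  };
  wrapped.count = () => computations;
  return wrapped;
}

const row = memoRender(({ name }) => `<li>${name}</li>`);
row({ name: "a" });
row({ name: "a" }); // cache hit: render skipped
row({ name: "b" });
console.log(row.count()); // 2
```

The commenter's deeper objection stands, though: this prunes unchanged subtrees rather than incrementally updating partially changed computations.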
13
lalwanivikas 1 hour ago 0 replies      
React and Khan Academy reminded me of this funny tweet by one of their developers: https://twitter.com/jdan/status/655432901782302720
The rise of mixed precision arithmetic nickhigham.wordpress.com
29 points by johndcook  4 hours ago   5 comments top
1
atemerev 51 minutes ago 3 replies      
Still, nearly no progress in fast decimals, which are extremely important in financial applications.

I'd even say that the only place where floating point is necessary is in simulations (physics, 3D, analog signals), all of which should properly be done on GPUs. Everything else (2D layouts, finance, data processing) is better served by either rationals or decimals.

We should remove floating point support from general-purpose CPUs and leave it to GPUs, where it belongs.
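The finance complaint is easy to demonstrate: binary floating point cannot represent most decimal fractions exactly, which is why money code usually keeps amounts in integer minor units (or uses a decimal library). A quick illustration in JavaScript, whose only built-in numeric type is an IEEE 754 double (formatCents is a hypothetical helper for the sketch):

```javascript
// Classic binary-float surprise: 0.1 and 0.2 have no exact binary form.
const sum = 0.1 + 0.2;
console.log(sum === 0.3); // false
console.log(sum);         // 0.30000000000000004

// The usual workaround: keep amounts as integer cents and convert to a
// decimal string only at the edges.
function formatCents(cents) {
  const sign = cents < 0 ? "-" : "";
  const abs = Math.abs(cents);
  return `${sign}${Math.floor(abs / 100)}.${String(abs % 100).padStart(2, "0")}`;
}

console.log(formatCents(10 + 20 + 1)); // "0.31" -- exact integer arithmetic
console.log(formatCents(-205));        // "-2.05"
```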

How the war on drugs creates violence washingtonpost.com
103 points by flannery  8 hours ago   79 comments top 10
1
cubano 2 hours ago 7 replies      
I just don't think many people realize just how much legal corporate and government "business" is created by the insane war on drugs, and I am convinced this is the reason why it continues.

Like I mentioned before, I just spent 6 months in the deep bowels of the criminal justice system for drug possession here in Florida, and those observations have shaped this view.

The cops, the judges, the lawyers on both sides (who eventually become the politicians that make the laws), the clerks, the guards (do they hate being called that!), the jail/prison administrators...they ALL are making pretty awesome livings from the war on drugs, and have zero repeat ZERO incentive to change anything.

The problem is not a legal one, it's economic.

The public is fed the propaganda of wrecked lives and violence to keep the status quo...until the population somehow wakes up and sees how the CJS is totally broken as perhaps even more corrupt then the drug game, I doubt anything can change.

2
kawa 5 hours ago 2 replies      
There's only so much profit in the drug business because of the illegality of many drugs. Changing that would kill the profits of many very rich and powerful criminals and cartels. So it's very understandable that those people do everything they can to prevent their business model from collapsing.

The usual way of doing it is simply lots of lobbying: Paying "well meaning" people, journalists and politicians to stay on the track of keeping most drugs illegal, so that the drug lords and cartels can continue to earn money.

So does really anybody wonder why the (obviously totally pointless) "war on drugs" is still waged and will probably waged for quite a while?

3
sixQuarks 1 hour ago 2 replies      
Besides all of the stated reasons to legalize/decriminalize, I think just as important is personal liberty.

I should be free to experiment with my own consciousness (so long as it does not impede on the rights of others) - how much more personal can you get? For the government to impede on this is unconscionable (pun intended).

4
awjr 7 hours ago 2 replies      
I've pretty much come down on the side of prohibition is bad. Yes hard drugs destroy lives, but many many many more are destroyed by socially acceptable drugs like alcohol and nicotine.

We live within societies where the way we control substances creates a huge social and financial cost.

We'd be better off legalising every drug and creating a social framework within which these drugs can be taxed, consumed safely, and users supported.

Case in point is the legalisation of cannabis which I hope is a stepping stone to legalisation of all recreational drugs.

5
fnordfnordfnord 14 minutes ago 0 replies      
I'm not too keen on Silver's analysis/conclusion that drug offenses account for such a low fraction of inmates' crimes. Drug possession turns a lot of otherwise non-criminal behavior into a "violent felony", and prosecutors love tacking these extra charges on when they can.
6
aidos 4 hours ago 0 replies      
The other day Richard Branson posted a leaked report from the UN Office on Drugs and Crime that agrees with the notion of drug use as a health issue, not a criminal one. Of course, they've turned around and said that's not their official stance. Sigh.

http://www.theguardian.com/society/2015/oct/19/un-call-decri...

http://www.virgin.com/richard-branson/finally-a-change-in-co...

7
suneilp 7 hours ago 1 reply      
Is it creating violence or exacerbating violence? I'm more inclined to choose the latter. The reason why I distinguish the two is that I see the continuation and increase in drug use as something symptomatic of other issues in the big picture.

Economic issues, politics, racism.... oppression on various fronts from the system down to the family to the self.

Drugs (and alcohol) are an escape. Normal activities also provide an escape when people get obsessed with them. TV, video games, food (my escape), exercise for some addicts, sex, etc.

Just legalize cannabis and decriminalize other drugs already.

8
upofadown 3 hours ago 0 replies      
If you have a business disagreement in the drug trade, you can't go to court to resolve it. Sometimes attempts to resolve differences go wrong. At some level, the violence in the drug black market exists simply because it is a black market.
9
DanielBMarkham 5 hours ago 1 reply      
I'm a libertarian, so I'm already sold on legalization. Count me in.

As such, hopefully I can quibble with the text a bit.

"Ceasing this hypocritical practice by releasing nonviolent offenders is morally urgent."

Yeah, not so much. Yes, there is a severe moral problem here, but please do not make moral arguments! It's folks with moral arguments riding around on high horses that got us into this mess. Instead, argue from the standpoint of practicality (which she does).

One of the practicality arguments she does not make, which deserves mentioning, is that because the drug war is unwinnable, there are too many laws. This makes folks with the power of selective enforcement lords over the rest of us.

Have a traffic stop? Cops ask to search your car? You have a right to say "no". But if you do, be prepared to wait around until the drug dog shows up. He'll sniff around your car and "alert" the cops, even if there's no drugs present. Then, guess what? They get to tear apart your car while you watch. All because of the war on drugs.

Let's say you are a drug user. You have a joint in the ashtray. In this case, it gets even better. Then -- if I'm not mistaken -- they get to take your car! A few dollars worth of illegal pot, which might not even be yours, and you could lose tens of thousands of dollars worth of car.

It's not that this is morally outrageous. It certainly is. It's that a system of justice cannot maintain the consent of the governed when it turns LE officers into something approaching highway bandits. Selective enforcement of drug laws -- both by cops and prosecutors -- distorts the legal system so much as to make it unworkable. Sure, it's bad, but the bigger point is that it cannot continue working in this fashion. Something's gotta give.

I liked the article. It's good to see public discourse slowly become much more reasonable about drug addiction and its consequences. One caution, though: in my opinion what we need to do is still stay tough on violent, hardened criminals while being more pragmatic about drug crimes. Otherwise we'll end up being slandered as soft-headed and irrational.

10
peterwwillis 2 hours ago 0 replies      
Violent crime in America rose dramatically as a result of worsening race relations after the death of Martin Luther King, a lack of employment and investment in local communities, and an increase in the use of heroin and (later) easier-to-produce drugs like crack.

Increases in robbery and petty crime track both the rise in drug use and the violence associated with it. But the war on drugs' main influence on this violence has been to keep the pressure on the drug dealers, increasing the risk of selling the product and making it less available, thus driving up prices, increasing competition, and therefore promoting violence between drug dealers, and by drug users trying to afford what they're addicted to.

And all of this leads to increased incarceration, not just from drug charges, but from the increased violence associated with the drug trade, gang warfare, and an unlawful under-society where people do whatever they can to get by.

A flow chart would make it a bit easier to grok, but basically the drug war throws fuel on a fire that was only simmering before.

Show HN: Delta view your Git split diffs in the browser octavore.com
6 points by tpwong  2 hours ago   discuss
Humanize the Craft of Building Interactive Computer Applications [pdf] melconway.com
26 points by adarshaj  4 hours ago   4 comments top 3
1
dexen 2 hours ago 1 reply      
I am amazed by how close LISP, or at least some dialects, comes to what the OP asks for:

* REPL

* code/data duality

* direct manipulation of the program

* program always running and you edit the VM image, instead of editing sources and starting the program again

* "[t]he application's user interface and a view of the application's structure sit side-by-side in front of the developer/user"

* "[t]here is only one form of the application and there is no translator."

2
riskable 3 hours ago 0 replies      
I have a new law to go along with Conway's: Organizations or individuals that produce content... are constrained to distribute said content in formats that mirror the tools used to create it.

For example, a person using the Pages application on a Mac might be inclined to distribute their writings via PDF using Mac-typical fonts utilizing Apple-inspired layout and formatting options. You know, instead of just making a web page or using a web publishing tool.

Despite the annoying format I did read the PDF and while I like the sentiment I can't help but think it's a lot of wishful thinking. People have been trying to make visual programming tools (i.e. tools with real-time positive and negative brain feedback loops) for a very long time now and they always come up short.

Viewing the labor of programming in real-time makes sense though and I think we can get there for a lot of use cases. Taking advantage of the web and interpreted languages (e.g. Python) or instant-compiling languages (e.g. Go) is probably how we'll do it.

The opposite of progress in this area would be utilizing extremely verbose languages like Java or C# or languages that require a lot of preprocessing and/or compiling and/or complicated build and deployment processes. Java has got to be the worst here with slow startup times, complicated and (often extremely time-consuming) deployment processes... And that's just for the IDE! Haha

3
treerock 3 hours ago 0 replies      
I'd love to see the prototype he mentions. I struggle to even conceive how something like this could work.
       cached 21 October 2015 16:02:04 GMT