hacker news with inline top comments    1 Jan 2015 News
1
India Orders 32 Websites Blocked, Including GitHub, Archive.Org, Pastebin
572 points by peter123  11 hours ago   232 comments top 56
1
bbarn 11 hours ago 6 replies      
Arvind Gupta, the head of IT Cell, BJP Tweeted: "The websites that have been blocked were based on an advisory by Anti Terrorism Squad, and were carrying Anti India content from ISIS. The sites that have removed objectionable content and/or cooperated with the on going investigations, are being unblocked."

This is bad. archive.org by default should have all sorts of offensive things on it. Pastebin and github should not be responsible for people hosting code they don't like. May as well block google too, I'm pretty sure you can find pro-ISIS sites on there as well.

2
revelation 10 hours ago 3 replies      
I love Indian government statements, they are always so transparently incompetent, inane and corrupt, as are the accompanying actions, like blocking a random PHP project on sourceforge.

I guess we'll just never know why they do these stupid things. By the time some bureaucrat has to give a statement all they can get out is terror, ISIS and anti India.

3
kartikkumar 10 hours ago 3 replies      
Ah that explains it! Haven't been able to push to Github ever since stepping off the plane in Mumbai. Strange thing is that the website works off-and-on. Pushing results in connection refused though. There's also been ZERO information/news provided by the ISP (MTNL) in this regard.

Absolute shame that a blanket ban like this is applied. It has a profound effect on everyday activities unrelated to the original reason for banning. Even if there is content of a questionable nature, it's absolutely crazy to not expose this. Let people make up their own minds about what is right or wrong. A simple ban on these websites isn't going to stop those who mean harm from getting to their goal.

All I can see that this results in is collateral damage, e.g., me not being able to push the latest commits for a research tool I'm building. I might be small fish, but that's the exact point; a ban like this necessarily works like a cluster bomb.

4
rajbot 17 minutes ago 0 replies      
I am trying to find out if archive.org is still being blocked or not, as we are hearing conflicting reports from users.

I queried 1844 Indian DNS servers on this list that are marked 'valid' or 'new': http://public-dns.tk/nameserver/in.html

Only one of them (182.59.1.235, operated by MTNL ISP) returned the fake IP being used for blocked sites (59.185.3.14)

I am trying to figure out if that means archive.org has been removed from the block list, or if the DNS servers listed on that page haven't yet been updated with the blocked sites.

If anyone can help us figure out if archive.org is still blocked, it would be greatly appreciated!

Thanks! -raj at archive.org
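
A minimal sketch of the kind of per-resolver check described above, assuming the third-party dnspython package (2.x); the single nameserver listed, the hostname, and the fake IP are taken from the comment or are placeholders, not a definitive implementation:

  # Ask each nameserver for archive.org's A record and flag any that
  # return the placeholder IP reportedly used for blocked sites.
  import dns.resolver

  FAKE_IP = "59.185.3.14"           # IP reported above for blocked sites
  NAMESERVERS = ["182.59.1.235"]    # e.g. entries from the public-dns.tk list

  def check(nameserver, hostname="archive.org"):
      resolver = dns.resolver.Resolver(configure=False)
      resolver.nameservers = [nameserver]
      resolver.lifetime = 5  # seconds before giving up on a server
      try:
          answers = resolver.resolve(hostname, "A")
      except Exception as exc:
          return f"{nameserver}: no answer ({exc})"
      ips = [rr.address for rr in answers]
      status = "BLOCKED" if FAKE_IP in ips else "ok"
      return f"{nameserver}: {status} -> {ips}"

  for ns in NAMESERVERS:
      print(check(ns))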

5
cryptbin 10 hours ago 1 reply      
I run Cryptbin.com, one of the sites on the banned list. As a result of the ban we have seen our traffic surge, with roughly a 1000% increase in traffic from India today alone.

Interesting to note is that we also own the domain cryptb.in (a TLD from India) and that has not been banned. However, it is merely a redirect, so it does not provide an alternative entrance to the site. We use it only for short URLs on public pastes.

6
FrankenPC 8 hours ago 3 replies      
I'm not an Indian software industry expert, so excuse me if this sounds like a dumb question.

Isn't it a terrible economic idea for India to block access to Github? I thought India was really big into software engineering?

7
gshrikant 10 hours ago 1 reply      
> A Government source said the decision to block the 32 websites were taken after thorough filtration process based on a strict regimen, and there is a proper committee in the Department of Information Technology in place to whet complaints. [1]

This is unbelievably ridiculous, if not downright stupid. Even as our Prime Minister speaks of bringing about a new digital revolution, decisions such as these show how badly equipped the lawmakers are in dealing with issues relating to technology.

Blanket bans like these are not only a form of internet censorship which flies in the face of the founding principles of the largest democracy in the world; the lack of any details or explanation before issuing an outright ban on several important software hosting websites and content providers also evokes an image of a myopic government with an incredibly poor understanding of technology.

[1] http://www.thehindu.com/news/national/now-modi-govt-blocks-3...

8
sametmax 10 hours ago 1 reply      
Wow, I'm the author of 0bin.net (it's an encrypted pastebin written in Python). Kinda feels weird to see your (really) small pastebin get caught in that. It's insane.

Well, it's open source and easy to install, anybody can duplicate it if needed so I guess it's ok.

Maybe we should add some way to replicate one instance content to other trusted instances to avoid this problem.

9
bdcravens 11 hours ago 6 replies      
If Github is truly blocked, that could be devastating for outsourced work, on both sides of the equation.
10
sunilnandihalli 7 minutes ago 0 replies      
This is bad!!!! what are they thinking?? github is blocked???
11
sunilnandihalli 8 minutes ago 0 replies      
This is bad!! what are they thinking!!! github is blocked??
12
catchmrbharath 8 hours ago 1 reply      
The sites are blocked only at the DNS level. If you switch your DNS from your ISP to either OpenDNS or Google DNS, then all these sites should work.
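
A small sketch of how one might verify that claim, assuming the third-party dnspython package (2.x) and using github.com purely as an example hostname: resolve through Google's public DNS, then open a plain TCP connection to the returned address, which should succeed if the filtering is DNS-only.

  import socket
  import dns.resolver

  # Resolve through Google's public DNS instead of the ISP's resolver.
  resolver = dns.resolver.Resolver(configure=False)
  resolver.nameservers = ["8.8.8.8", "8.8.4.4"]
  answer = resolver.resolve("github.com", "A")
  ip = answer[0].address
  print("resolved to", ip)

  # If the block is DNS-only, a direct TCP connection to port 443 should work.
  with socket.create_connection((ip, 443), timeout=5):
      print("TCP connection to", ip, "succeeded; block appears DNS-only")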
13
giis 11 hours ago 0 replies      
I live in the southern part of India, and I can access archive.org, github, and pastebin from here. I don't think it's a complete ban; maybe some ISPs are blocking these sites. (Checked in some code to github a few minutes back :D ) I hope they will revoke this move in the upcoming days.
14
doe88 10 hours ago 0 replies      
I think nowadays blocking Github is almost dumber than blocking coursera.org. It's an invaluable resource for all CS students.
15
aselzer 10 hours ago 0 replies      
This one is interesting: http://sourceforge.net/projects/phorkie/

According to the list it has been specifically blocked. It appears to be a pastebin clone written in PHP.

Almost all of them are pastebin-like sites.

http://atnsoft.com/atnsoft.com/textpaster/ seems very unrelated.

My assumption would have been that a government crawler stumbled upon some messages it didn't like and the sites they were on ended up on the list, but the two sites above would be unlikely to be affected by this.

Archive.org was probably affected because it mirrored some content from one of them.

16
d0ugie 11 hours ago 1 reply      
A lot of these sites, almost all of them, are pastebin-esque, including sourceforge.net/projects/phorkie/ which is a "PHP and Git based pastebin."

I noticed Pastebin getting a fair number of mentions in the news in connection with the Sony attack as a place for hackers to dump sensitive information publicly and easily.

My guess is this blacklist was assembled to mitigate such hacking damage on Indian targets, but it was assembled with some haste; I doubt github and vimeo will remain blocked for too long.

17
pavanred 5 hours ago 0 replies      
Isn't this setting a dangerous precedent? I am sure such a tendency to issue blanket bans could be misused. Perhaps it's the cheapest way to attack or bring down a big website, or even quell competition: just post some offensive content on a website and let the government issue a blanket ban.
18
readme 9 hours ago 1 reply      
Hey I know! IT is the backbone of our economy, so let's block a bunch of sites that programmers like.

--The Indian Gov.

19
scriptle 1 hour ago 0 replies      
So, you're requested to change your DNS setting to "8.8.8.8" & "8.8.4.4" - Google's public DNS. I guess most of the ISPs just blocked the domain names and not the IPs those domains resolve to. It worked for me in two networks.
20
dagwn 1 hour ago 0 replies      
21
sgarg26 10 hours ago 1 reply      
Yes, this move is an overreaction, especially banning archive.org... India was just hit with a terrorist attack in Bangalore very recently. The techdirt article should mention this part.

http://www.ndtv.com/article/cheat-sheet/bangalore-bomb-blast...

22
pratnala 1 hour ago 0 replies      
Well, all they have done is just DNS blocking. Change your DNS to 8.8.8.8 and you can continue to access these sites.
23
anExcitedBeast 9 hours ago 1 reply      
Why stop at the domain level? India should block all *.com traffic until Verisign takes down all objectionable content.
24
arielm 4 hours ago 0 replies      
Looks like India is using a cannon to kill a mosquito. Regardless of what we think about terrorism, disabling access to entire sites companies rely on for their daily operation is simply careless.

As noted by other commenters, they didn't even try to resolve the situation but rather went the route of blindly blocking access.

This is a huge red flag, in my opinion, that the Internet as we know it has reached a big fork in the road, and where we go from here will determine whether our future looks like Biff's world in Back to the Future II or not...

25
chris_wot 4 hours ago 1 reply      
Oh-ho! Security is evidently high on the Indian government's priority list... the Joint Secretary for the Ministry of Petroleum and Natural Gas, a Mr Shri P. Kalyanasundaram, has an email address at Yahoo for official correspondence! [1]

I'm wondering how long it will be though before the Wikipedia article gets updated:

http://en.wikipedia.org/wiki/Internet_censorship_in_India

1. http://pgportal.gov.in/pgo.aspx

26
IvyMike 10 hours ago 0 replies      
The ominous part is that they blocked sites where you can easily share information in bulk.

Is the intention to prohibit such sharing in general? Such efforts are doomed to fail, but that doesn't mean it won't be a hell of a ride.

27
suhair 11 hours ago 1 reply      
Can access github, archive.org, and pastebin through BSNL, the Indian state-owned telecommunications company. EDIT: A major newspaper in India reports it was because of content related to ISIS and the ban was removed later: http://www.thehindubusinessline.com/features/smartbuy/tech-n...
28
sayhar 11 hours ago 1 reply      
Thanks Narendra Modi.
29
nnain 8 hours ago 0 replies      
How I wish the Indian top brass gets to read this thread! The new government came with promises of getting rid of bureaucratic hurdles... and now this.

The guy who made this list of course didn't have the guts to put Facebook or Twitter on it, because the Prime Minister / PM Office actively uses those tools to reach out to the people.

30
boyter 2 hours ago 0 replies      
Oddly, I happen to be in India right now. I can still access every site listed. Not sure if it has not rolled out yet, or if perhaps foreign internet has fewer restrictions.
31
bradleysmith 11 hours ago 1 reply      
This is remedied with a simple VPN, correct?

only justification in that article:

" Arvind Gupta, the head of IT Cell, BJP Tweeted: 'The websites that have been blocked were based on an advisory by Anti Terrorism Squad, and were carrying Anti India content from ISIS. The sites that have removed objectionable content and/or cooperated with the on going investigations, are being unblocked.' "

32
hsivonen 10 hours ago 0 replies      
"However, the key nature of many of the sites affected, and the fact that entire sites, rather than just some of their pages, were blocked, is bound to lead to calls for this blunt instrument to be refined before it is used again."

It would be more worrying if ISPs could block individual pages on https sites like GitHub.

33
confluence 11 hours ago 0 replies      
Someone has made a huge mistake and should be fired over this.

Utterly ludicrous.

34
tkirby 8 hours ago 0 replies      
Don't block stackoverflow. The economy will crash.
35
javajosh 10 hours ago 1 reply      
"Blocking websites" is a degree-of-freedom national governments simply should not have.
36
h43k3r 6 hours ago 0 replies      
I can confirm that many of these websites are still working on my university connection in India.

And I promise, if they don't revert their decision, I will launch a proxy site for all these websites.

37
bigphishy 10 hours ago 0 replies      
What exactly are the consequences of banning github on a fallacious, minuscule and largely anti-technological terrorist organization?
38
ToastyMallows 11 hours ago 3 replies      
Is this just a DNS block or do they control all of the ISPs in India? Sorry I'm ignorant about the state of the Internet in India.
39
musesum 7 hours ago 0 replies      
Have offshored some dev in India, in the past. We depend on Github. VPNs aside, idiotic IT policy by India nudges us towards South America. Same goes (I think) for China in blocking Google Apps.
40
anupshinde 10 hours ago 0 replies      
I can access Github, archive.org, and pastebin from state-owned BSNL with the ISP-provided DNS (from the western part of India). Have confirmed from a few other parts too and it seems to be working. Have been checking for the last 6 hours and it hasn't been down yet.
41
mavdi 7 hours ago 0 replies      
Github... Narendra Modi is a total fucking idiot. There goes billions of $$$s of lost income for indian developers with out of date skills.
42
grannyg00se 4 hours ago 0 replies      
Does anyone have an example of ISIS supplied anti-India content?
43
ghantila 9 hours ago 0 replies      
I'm on Airtel Broadband, and all the 32 websites are working fine (at least for now). Maybe because I'm using Google Public DNS.
44
Zigurd 3 hours ago 0 replies      
India blocks github for subversion </rimshot>
45
ilamparithi 10 hours ago 0 replies      
Can confirm. Github and archive.org are not accessible. (From South India)
46
AxisOfEval 6 hours ago 0 replies      
The Indian government is being utterly stupid. This is tantamount to saying: "Terrorists breathe Oxygen? Neat! Ban Oxygen."
47
known 9 hours ago 0 replies      
A terrorist is a freedom fighter who isn't on your side.
48
seshakiran 9 hours ago 0 replies      
Github? seriously?
49
byEngineer 3 hours ago 0 replies      
Who cares?
50
skazka16 10 hours ago 0 replies      
Seems like a good time for bitbucket.
51
chocks 10 hours ago 1 reply      
Why block sites like github instead of issuing DMCA takedown notices? Doesn't make sense.
52
iamleppert 6 hours ago 0 replies      
Good riddance to those in India. Bye bye!!!
53
vegabook 7 hours ago 3 replies      
There is a case to be made for the idea that America-hosted, America-doctrine-birthed websites do not jive with everybody on the planet. The idea that Github is beyond reproach and blocking it makes no sense is superficially tempting, until you realise that the vast, vast majority of projects on it are America-led. Why is it so surprising to block a culture which is permeating the planet, to the detriment of other cultures? And of course many non-Americans will object to this idea, but that is because they tend to be at the top of their local game and looking to be hired into (or otherwise benefit from) the America-led, capital-driven, individual-first-at-the-expense-of-the-community ideology.

There is a tendency to knee-jerk condemn these blockages, including those in China, or indeed in Europe. It is not obvious to me that some kind of barrier to the Americanisation of the planet, including via its dominant websites, is such a bad thing.

54
nsnick 8 hours ago 0 replies      
Maybe we can finally get Java off of GitHub.
55
hnroops 6 hours ago 0 replies      
In the USA, anyone can quickly write a DMCA request and after a few hours the requested website will be offline. So, don't bother with the USA. India is much better for internet startups.
56
vithlani 6 hours ago 1 reply      
LOL... welcome to the third world, with a tinge of fascism.

I would like to see what all the curry apologists on HN come up with to defend this bullshit.

2
Fraud caused disappearance of 99% of Mt. Gox Bitcoins
21 points by jpatokal  1 hour ago   9 comments top 3
1
drcode 47 minutes ago 1 reply      
From all the theories I've seen, the most likely to me is that this was a business that slowly drifted into a whirlpool of fraud and insolvency, probably starting with good intentions.

Most likely, they had a hard time maintaining the right ratio of bitcoin/yen/USD to match the reality of their customers' deposits (through incompetency) and got hit hard when the "wrong" price fluctuations occurred... when this happened, and in which direction the fluctuation happened, I cannot say.

After that, they were deeply into a fractional reserve situation and thought "hey, we own a large part of the Bitcoin market, we can probably play the price a bit to make our customers whole again, without anyone knowing what had happened."

In this way, they gradually drifted from "cutting corners and doing the ugly things required to keep a business afloat" (aka mild, veiled fraud) into outright fraud.

2
patio11 33 minutes ago 1 reply      
I'll try to find a print copy when the Yomiuri hits news stands on the 3rd and post the gist of it, but don't get your hopes up on this shedding a lot of light on the story.
3
butwhy 5 minutes ago 0 replies      
This is still a mystery? I thought it was widely accepted that Karpeles stole the money for himself to buy Starbucks.
3
Snapchat has raised $485M more from 23 investors
45 points by drum  4 hours ago   51 comments top 11
1
olalonde 2 hours ago 4 replies      
I have never used Snapchat so I might be off here but if I understand correctly, its main distinguishing feature is the ephemeral nature of content shared with it.

I find the concept of imposing artificial constraints on apps/interactions fascinating. Twitter was arguably the one to popularise the idea with its 140 character limit (perhaps by accident, the limit was initially there to support SMS) and now Snapchat (ephemeral posts). Are there any other apps that play on this theme?

Makes me wonder if there are other apps out there waiting to be "constrained". Here's some dumb ideas off the top of my head. What about a social network where you can only have 10 friends? What about an email inbox where you are limited to receiving X emails/hour (perhaps senders could bid on delivery priority?). What about a HN where you are only allowed to comment once a week? What about a continuous delivery system where you are blocked from releasing after you reach a quota of defects (I heard Google uses such a quota system internally)? What about a package repository which rejects packages with over 150 lines of code or some other quality metrics?

2
_almosnow 2 hours ago 3 replies      
I like Snapchat, I still can't grasp their plan to become the next great thing but at least they have a lot of users and their users 'don't want to leave'.

I always thought that Facebook was too big to fail, as in no one would close their account because of what they've invested there (friends, pics, etc...); yet, people are leaving it at an unprecedented rate. I remember Facebook desertion not being much more than a statistic even a year ago; now it's pretty common to encounter people that don't have an account there anymore. I think that the bandwagon effect that is behind the growth of online communities is a double-edged sword; once the trend shifts to users leaving, the more they leave the more users are likely to leave later, and it snowballs until there is no one left. Fortunately for Facebook, many of those peers left because of WhatsApp/Instagram, so the business is still in the family... for now, albeit much less profitable. Sooner rather than later, FB will be gone and its place will have to be filled up by something else. Snapchat has a seat reserved in the post-FB era and apparently that is worth at least $20B.

Anyway, derailing the discussion a little, I'd like to hear what you'd think if 'Core Facebook' went out of business right now (but not WhatsApp/Instagram). Would you consider it a success or a flop? Was it a profitable endeavour or not?

3
elberto34 2 hours ago 6 replies      
This talk of bubbles reminds me of 2007 when everyone, including all the experts, was certain Facebook was a bubble at a valuation of $15 billion after Microsoft invested; now it's worth $200+ billion. Then in 2012, after Facebook's hugely publicized botched IPO and Nasdaq error, all the experts again said the web 2.0 bubble had burst; the stock price and earnings have since doubled. Unlike the big blowups of Friendster, Myspace, Digg, etc., these post-2008 web 2.0 valuations have proven to be extremely sticky. Pinterest, Twitter, Dropbox, Air B&B, Tinder, Snapchat, Whatsapp, Uber, Instagram... all keep going up with no end in sight, year after year, until either IPO (which finally creates volatility) or buyout. There are hardly any big failures or blowups, except perhaps Zynga and Groupon (although it's still worth $5 billion).

My prediction is these web 2.0 valuations will keep rising for many years to come because that is the path of least resistance, and the investor demand and user growth for these companies is seemingly unquenchable. The unending web 2.0 boom and unending wrong predictions about its demise show how these 'obvious' parallels to the old tech bubble of 1995-2000 are just so wrong. There's more at play here, such as the investor flight to quality (more money chasing fewer companies), huge user growth, huge monetization potential from smartphone engagement, the very large millennial population that uses these services, and the ability of these web 2.0 companies to carve out niche dominance and then keep it.

Within the next year or two, we're probably going to see Uber being worth $100 billion before IPO, Snapchat $50 billion, Tinder $10 billion, Air B&B $50 billion, etc. Take every valuation and quadruple it. Back in the 90's, $100 million was a big deal; now that's just a rounding error or the equity of just a single early employee. Insane, but very prosperous times we're living in. And it's got a long way to go.
4
tonyjstark 1 hour ago 0 replies      
I always thought that if you raise money, the investors want the money back in the end. Maybe they want even more than they invested. So if a company raises more money than some of the old players of the game who actually have big revenues, one has to think how much revenue the investors expect from this company in the next years.

It seems like everybody only bets on Snapchat being bought by a bigger player, which is a strange business case because there is no value, only assumptions, being generated. I believe (and that is very subjective) that that kind of investing is sickening the whole industry; it feels more like the sort of speculation which already caused problems in the banking sector. But it will be fine as long as the majority plays along.

I found myself feeling rather conservative when I think about the valuation of one of those startups compared to other companies, and for me that doesn't work out. Maybe it's because the whole market changed in the last few years, but maybe it's the b-word. I look forward to finding out.
5
mehwoot 3 hours ago 1 reply      
Snapchat originally set out to raise $40 million, but demand for the round skyrocketed, and it decided to shoot for an ambitious $900 million instead. When that didn't work out, it dialled it back to $500 million.

Mind boggling.

6
forrestthewoods 3 hours ago 3 replies      
My rule of thumb for awhile has been that if you can get one hundred million users (100,000,000) you can sell your company for one billion dollars ($1,000,000,000). It doesn't matter if you have any revenue or not, 100m users = 1b dollars.

Snapchat is at 200m users, but has doubled since August. If you think it's headed for 500m users then 10b is only a 2x premium for an unusually large pool of users in one place.

WhatsApp sold for 18b with 500m users. It was headed for 1b users so 18b is a similar 2x premium.

At first I thought the math didn't work but I guess it does. Users are king. Engaged users are directly convertible to money.

7
moab 3 hours ago 2 replies      
I'm incredibly curious about how they're planning on monetizing, considering that that recent stab at pushing micropayments was a major flop. Ads that aren't full-fledged 'snaps' (and don't feel spontaneous) seem like a sure way of pissing off their userbase. Stories seem like an effort to push for fb's ads strategies, but whether this will pick up and become a real competitor is questionable.

Props to them for not getting acquired though, and pushing on.

8
nnain 1 hour ago 0 replies      
Snapchat, Tinder & Instagram, for one thing, brought innovative ways of interacting with the apps, and they have very active teams. Most other social apps are me-too clones and quickly fizzle out.

It's still difficult for people sitting outside North America to comprehend these high valuations. Snapchat's CEO's leaked letters a few days back provide an interesting peek into how he is thinking of bringing in revenues. To me, Dropbox and YouTube are more interesting case studies. It all seems quite simple now, but in their initial years, people wondered how they were going to monetize. But I have begun to see a clear trend in how the social apps, sharing (rental) economy apps, and ecommerce are behaving at different places across the globe.

100 million users of a social app in the US (followed by other western countries) generate several times more revenue than in developing countries. The one metric that matters here is the Average Revenue per User (ARPU). Mobile advertising is growing in second and third world countries, but still lags behind. Not to say that users elsewhere are any less useful; Facebook has a huge focus on the Indian market.

Apps for the rental (sharing) economy, Uber et al., work more evenly everywhere, since they take a straight cut of the amount paid.

The segment that seems to work most at par globally has to be ecommerce. Amazon committed a $2Bn investment in India in 2014, as Flipkart got over 1.5Bn in funding.

9
auganov 1 hour ago 0 replies      
As long as Snapchat has a good case for maintaining the monopoly on ephemeral messaging, the valuation is pretty reasonable. It's the best thing since IM and nobody else seems to get it. FB messenger could shake things up, but they seem reluctant. It's hard for any social/messaging incumbent to do it without cannibalizing their existing user activity. And they have 2 patents which may or may not be valid.

My prediction is 1B users by 2016.
10
joeblau 2 hours ago 0 replies      
It seems like Yahoo! makes more money investing in other companies than it does as a company. They seem to invest in lots of startups, including this one, that end up being huge.
11
chad_strategic 1 hour ago 0 replies      
With interest rates near zero for the last 6 years, what is really the value of money?
4
Hidden Costs That Engineers Ignore
51 points by gsands  7 hours ago   19 comments top 6
1
jonpress 1 hour ago 0 replies      
The problem is that as requirements grow, class structures are often kept the same but the 'glue logic' which operates on these structures becomes increasingly complex (you have to handle a growing number of edge cases).

There is a point where the glue logic becomes very complex (and brittle) - At that point, the best thing to do is to redesign part of the system's class structure.

If your components (at all levels of your class hierarchy) are specific about their own behaviors, it means that you can use simpler glue logic to make them work together.

When you need to handle a lot of complex use cases, it's often useful to have many specific classes which share the same interface and can be used interchangeably.
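
A hypothetical sketch of that last point (the names here are invented for illustration, not taken from the article): several small classes share one interface, so the glue logic shrinks to a lookup plus a single call instead of a growing chain of conditionals.

  # Each case gets its own small class behind a shared interface,
  # so the "glue" is a one-line dispatch rather than branching logic.
  from abc import ABC, abstractmethod

  class PaymentHandler(ABC):
      @abstractmethod
      def process(self, amount: float) -> str: ...

  class CardPayment(PaymentHandler):
      def process(self, amount: float) -> str:
          return f"charged {amount:.2f} to card"

  class InvoicePayment(PaymentHandler):
      def process(self, amount: float) -> str:
          return f"issued invoice for {amount:.2f}"

  HANDLERS = {"card": CardPayment(), "invoice": InvoicePayment()}

  def checkout(method: str, amount: float) -> str:
      # Glue logic stays simple: look up the handler, call the shared interface.
      return HANDLERS[method].process(amount)

  print(checkout("card", 19.99))
  print(checkout("invoice", 250.00))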

2
seivan 38 minutes ago 0 replies      
You can't do this for games. Sometimes a game needs a particular feature that has a ton of states and complexity.

The only thing you can do is: 1) mitigate that by isolating it, 2) separation of concerns.

You could have a component that's fuck-all complex, but as long as it's a single, isolated component, that can be disabled by

messedUpComponent.isEnabled = false, then I'm ok.

3
wavegeek 52 minutes ago 1 reply      
There is a good book about this. It changed my thinking about complexity. The key insight is that if you don't manage complexity you will drown in it. Complexity is OK as long as it really earns its keep. And a lot of complexity is not worth it. (I speak as a person who just spent 20 hours - less than the IRS's estimated 30 hours - filling in form W-8IMY for the IRS, a form which started as a bright idea by someone in congress).

"Conquering Complexity in Your Business: How Wal-Mart, Toyota, and Other Top Companies Are Breaking Through the Ceiling on Profits and Growth Paperback"by Michael L. George and Stephen A. Wilson

4
zedpm 4 hours ago 1 reply      
The notion of holding a Code Purge Day (as Quora did) is excellent. Searching out and removing cruft in the code base results in a benefit that's hard to measure but easy to understand. Having more compact code without confusing distractions leads to easier comprehension and ought to make for faster bug fixes.
5
stefs 1 hour ago 0 replies      
currently it looks like they ignored the hidden cost of being on the hn frontpage.
6
logicallee 3 hours ago 3 replies      
This article repeats a meme, but is total BS. It's similar to saying "the hidden costs of being a founder" and then stating that if you do ever make a successful exit, you will have a lot of your time wasted by having to select bedlinen and other hidden costs of having made an exit. (the hidden cost to being a founder - you will be forced to work unpaid hours shopping for luxury bedlinen. so consider carefully.) Well guess what: that's not real technical (or founder) debt!

The idea of calling this stuff technical debt is simply laughable. there's no debt. you don't owe shit.

hint: if you code something up in 20 minutes you still don't have to do all that other stuff, you can just ignore it, and throw away the twenty minutes of code if it's not better than not having it. if you become a millionaire you don't have to do all that stuff, you can just ignore it.

Let me put it another way. Say you're an MBA who can't code anything but excel formulas, yet you figured out how to get excel and powerpoint onto the web as a web app (wat). you create something and get 5,000 paying users and raise a $500,000 investment.

Have you created technical debt? No. You're at the same square as if someone gave you $500,000 to spend on developers against your mockups, except that you've validated them as well. There's no debt here. This article had it right:

http://www.higherorderlogic.com/2010/07/bad-code-isnt-techni...

The people that call this scenario (excel on the web) technical debt, think that somehow the MBA 'borrowed' the web app from a real dev, and now owes it in real development costs. That's a wrong way to think about it. In fact it's a ridiculous way to think about it.

5
A Magician's Best Trick: Revealing a Basic Human Bias
116 points by anigbrowl  9 hours ago   47 comments top 12
1
praptak 6 hours ago 2 replies      
James Randi did a similar trick, i.e. one even less reliant on mechanical tricks or a quick hand but rather on human bias.

Students in a class were given horoscopes in envelopes marked with their birth dates. The students were then asked to read the horoscopes and tell if they were accurate. Most were amazed how accurate the horoscopes were.

The trick is that every student had been given the same horoscope which was just cleverly written.

2
WalterBright 8 hours ago 2 replies      
I just finished reading "Thinking Fast and Slow" by Kahneman. He goes into great detail about our various cognitive biases, and how they constantly lead us astray. The most fascinating insight is how they still lead experienced statisticians astray in the same way, and even when these biases are pointed out to them! The mistakes keep coming.

We're not half as intelligent and rational as we believe we are.

3
click170 8 hours ago 0 replies      
If you enjoyed this you might be interested in reading up on James Randi. To quote Wikipedia, he is "a Canadian-American retired stage magician and scientific skeptic best known for his challenges to paranormal claims and pseudoscience."

There are numerous documentaries and films about him and his life, the one that got me started was a BBC Storyville episode: http://www.bbc.co.uk/programmes/galleries/p029bgws

It's not currently available on iPlayer apparently, though that restriction does not apply to torrent sites. Couldn't recommend it enough.

4
rickdale 7 hours ago 1 reply      
Magic is fascinating. There's a magician named Michael Carbonaro who is in the first season of his TV show, called The Carbonaro Effect. Basically, he sets up scenarios and goes on to fool people using magic. My favorite scenario that I saw was when he was showing people how to re-bean ground coffee beans and the lady he was showing it to said, "Yeah, I know how this works. I used to run a coffee shop."
5
eggoa 8 hours ago 3 replies      
Reminds me of this ancient web magic trick.

http://www.angelfire.com/ak2/intelligencerreport/page67.inde...

6
AndyNemmity 7 hours ago 2 replies      
I'm a magician, and there are many extremely interesting parts about human psychology that relate to lessons you learn. This isn't one of them.

This is a fine fluff piece about someone's experience watching a magician (and not a particularly good one from the tale). Nothing wrong with enjoying the story, but it doesn't reveal anything particularly interesting.

The psychology of the spectator is the most fascinating part of magic to me. I'm not sure how it would relate to "diplomacy, politics, finance and everyday life."

I know I use the same psychology in social situations, and public speaking that I do in magic, at times. It's hard to generalize because magic is an amazing field of a ton of areas.

I want to tell you what it is, why I use it, and details about it because it's a passionate hobby of mine, but magic is also the only field I've ever known where you aren't allowed to share what you do.

You build up a tremendous amount of skill, and then hide it. Guitarists show you what they do, amazing. Artists, Actors, Jugglers, Comedians..

Magicians are the only one where success is hiding your skill.

7
GmeSalazar 7 hours ago 0 replies      
This article reminds me of some videos I've watched on YouTube about paranormal "investigations". It's kind of funny how prone the "investigators" are to interpret random events as paranormal manifestations. Background noises are generally seen as EVPs [1] and they manage to match them to something they were expecting to hear; [2] provides a reasonable explanation for that.

[1] https://en.wikipedia.org/wiki/Electronic_voice_phenomenon
[2] https://en.wikipedia.org/wiki/Apophenia

8
sparkzilla 7 hours ago 1 reply      
For a good example of confirmation bias watch Derren Brown's "The System" [1], where he makes a woman believe an unknown benefactor has a foolproof system for betting on horses. "Mail Order Prophet" [2] by Alfred Hitchcock has a similar idea.

[1] https://www.youtube.com/watch?v=9R5OWh7luL4
[2] https://www.youtube.com/watch?v=zh0MvRagES0

9
baddox 7 hours ago 1 reply      
The dime-in-both-hands trick is a very clear example, but it also sounds so simple and obvious that I can't imagine many people falling for it, especially if they knew the performer was a magician. But perhaps I'm just so cynical that I would always be more concentrated on figuring out a magician's tricks than letting myself be entertained.
10
Kiro 6 hours ago 0 replies      
This is a great "prank". Anyone know of any similar ones? I don't even know what to call the genre. It's more mind bending than pranky.
11
boothead 8 hours ago 0 replies      
Anyone interested in the neuroscience behind magic should read Sleights of Mind http://www.sleightsofmind.com/. Really fascinating book!
12
trentmb 7 hours ago 0 replies      
I wore a spacesuit once. I wasn't an astronaut, just a drunk man in a spacesuit.
6
U.S. cancer deaths have fallen 22% since 1991
18 points by tokenadult  2 hours ago   11 comments top 4
1
nkangoh 20 minutes ago 1 reply      
It's very difficult to say whether this is good news or not. Is it not possible that people are dying from other causes that have increased since 1991 and that strike before cancer would? In other words, couldn't increases in those other illnesses explain this decrease?
2
aosmith 1 hour ago 1 reply      
This is because of effective treatment. I was dx'ed with "fatal" cancer just over 10 years ago... Modern medicine FTW.
3
danieltillett 1 hour ago 4 replies      
So what cause of death went up?

Edit. To answer my own question it seems to be the very vague "All other causes" [1].

1. http://www.cdc.gov/nchs/data/databriefs/db88.htm

4
sbjustin 50 minutes ago 0 replies      
I suspect, with no evidence to support it, that some of this may be related to people dying of other things instead of cancer, such as heart disease, etc.

Another interesting fact: if you look at US deaths, fewer die between the ages of 60 and 80 than before and after. Seems like if you can make it to sixty you'll die of old age or some random cancer.

7
What Do We Want, Really?
66 points by foolrush  7 hours ago   30 comments top 13
1
grondilu 4 minutes ago 0 replies      
I've recently watched a video titled "on the purpose of Life"[1], from YouTuber Robert Murray-Smith, who usually posts videos about tech and science, mostly graphene-related stuff.

He makes a nice point that trying to assign a purpose to everything we do, and to Life itself, may be a mistake. He takes animal behavior as a comparison. Animals usually don't act in order to achieve a high-level goal. They do things because that's what they are built to do. Most of their actions are consequences of basic impulses, emotions and instincts. Now I understand we are highly sophisticated mammals with big brains, and as such we are capable of acting on a rational basis towards an abstract notion of a goal often simplistically called "happiness", for instance. Yet we are still very much animals, so it may be very wrong to think all our actions should be based on this mode of behavior. If we do we may face the absence of a definitive answer to our quest for purpose, which would explain our tendency to jump into goals made up by religion, or into philosophical wanderings which can sometimes be quite unsettling.

1. https://youtube.com/watch?v=xc7kM0mSVfw

2
shutupalready 5 hours ago 1 reply      
We need to be thankful that pockets of extreme nonconformity exist to show an alternative to groupthink. People are so eager to mock or crush any diversity. Average people can't stand the idea that there are places in the world with extreme differences like (a) much higher or lower tax (using the U.S. as reference point), (b) drug attitudes far softer or much worse, (c) political structures (even the "evil" ones, speaking from the American POV), (d) far different cultural attitudes about sex, marriage, drinking, and every other moral issue. If the world didn't have all this diversity, we'd never know that there's an alternative to what we thought was the best way.
3
rdtsc 4 hours ago 2 replies      
Amish are very nice people, warm, friendly, and with strong convictions. I was lucky to have been invited once to share a day with them (I visited a community in Ohio, had dinner, talked to them). But nevertheless, the whole "no electricity" thing is crazy. Depending on the area, there are workshops devoted to converting electrical appliances to pneumatic power. Yes, they take things like blenders and food processors, pull out the electric motor, and put in a pneumatic one. Then, of course, they run compressed air (or vacuum?) lines through their house. Some reinterpret it as "you can't be connected to the mains", so they charge batteries and then use the batteries to supply power to the house or machinery. Think about the brainpower and time spent doing that. There is no logic or careful analysis of "advantages of electricity vs disadvantages"; it is plain silly.
4
codingdave 4 hours ago 0 replies      
I think the comments so far are getting too caught up in the details of the article.

The point is not whether or not you like TV, or what choices the Amish have made. The point is to engage in a thoughtful decision process with technology (or anything else in your life), and decide if it truly is helping you be the person you want to be, helping your family to grow on a personal level, and improving your community.

If you have truly thought it out and decided that any given tech is good for your life, great. But if you are just bringing technology into your life because it is new and shiny, you might want to consider stepping a level or two deeper in your decision process.

5
jostmey 5 hours ago 0 replies      
I like the article and the point that it makes using the Amish people as an example.

But the Amish did not make a conscious decision to refrain from watching TV. No, the Amish shied away from electricity because of a religious belief that was in place before Television even existed. It is also worth noting that the Amish do not believe in education past 8th grade, and yet everyone would raise their hands if they were asked if they would want their children to go to college.

6
planckscnst 5 hours ago 2 replies      
I don't think my kids would be better off without TV, and I suspect most people don't actually think that, either.

TV (just as books) is a tool for delivering entertainment, education, and culture, and it's reasonably effective at that.

I'm glad I had TV when I was a child. The thing I value most is that it inspired an interest in science, math, and technology. Thanks, Square One, Bill Nye, Beakman's World, Discover, etc! Later on in life (teenage years), it taught me about time management; you can allow yourself to be entertained for many hours and not actually feel better for it, but actually worse, as you've lost that time; you can allow yourself to be entertained for an hour and it will change your whole outlook on the day. Always be aware of what you are gaining and giving up for entertainment.

Anyway, I could go on about why I love TV (and these days, the Internet), that's the gist.

7
tjradcliffe 4 hours ago 2 replies      
Claims like the one that "we" want what we cannot possibly have at the price "we" are willing to pay are tiresome. Who is this "we"?

I'm perfectly willing to pay the price of living the way I do, and so are a great many people here. Most of us have given the technologies we use some thought and selected the ones we use to maximize benefit and minimize cost.

There's no data presented in the article, just some guy's informal impression that he can get away with accusing most people of hypocrisy. I dunno... I don't feel like a hypocrite. Do the majority of people here? My informal impression is they don't, but that's worth about as much as the informal impression in the article.

A more plausible reality is that we all have doubts that we've chosen well, when making choices of technology etc, and we're aware that both costs and benefits can be hidden and only show up at a later time. But that latent concern is quite different from what's being imputed by the article, which doesn't even get the Amish right: their rejection of many modern technologies is driven not by any consequentialist cost-benefit analysis, but by a deontological desire for plainness, self-effacement and submission to the rule or order of their anabaptist religion.

8
joshjkim 5 hours ago 0 replies      
I like this article - agree with the others that it gives the luddite concept a little too much love and doesn't properly recognize the huge benefits of mass media and tech, but still, the basic realization that self-determination in the face of enabling technology, media, culture etc. almost always requires real hard work and sacrifice is something worth contemplating - perhaps more importantly, interesting to consider that the utilization of all society/progress has to offer is in some cases the opposite of self-determination.

I might keep reading this blog.

9
Apocryphon 3 hours ago 0 replies      
I don't think the core of this article is about technology at all. My main takeaway is that often we imagine that the futures we're working towards can be accomplished through incrementalism. But it really requires far more radical action than that.

This can also be applied to self-improvement.

10
887 2 hours ago 0 replies      
The question on the bus reminded me of the talk Stallman gave at the 31c3.

You can watch Stallman's talk here for reference: http://cdn.media.ccc.de/congress/31C3/webm-hd/31c3-6123-en-d...

11
hartator 5 hours ago 2 replies      
Until you get sick and you are happy to have access to modern medicine. #Sigh.
12
elwell 4 hours ago 0 replies      
I just watched episode two of Black Mirror (on Netflix). It shows an interesting world in which tech has really taken over the human.
13
robbrown451 3 hours ago 0 replies      
Does a horse and buggy go fast enough to notice the Doppler effect?
8
The End of Gangs
122 points by ern  16 hours ago   40 comments top 9
1
corysama 9 hours ago 2 replies      
2
joshuahedlund 9 hours ago 0 replies      
For anyone interested in urban gang dynamics I highly recommend David Kennedy's book Don't Shoot. I recently read it and it contains some amazing insights into misunderstandings that communities of law enforcement and urban neighborhoods have about each other and gangs which leads to rational but wrong behaviors that perpetuate those misunderstandings, along with some brilliant ideas to change those dynamics with examples of those applied ideas working in multiple neighborhoods across the country.

http://www.amazon.com/Dont-Shoot-Fellowship-Violence-Inner-C...

3
mathattack 5 hours ago 0 replies      
This has created the only-in-L.A. phenomenon of commuter gangs: guys who drive a long way to be with their homies at the corner where the gang began. (In the 204th Street neighborhood in the Harbor Gateway, I met gang members who drove in from Carson, the San Gabriel Valley, and even Palm Springs.)

The joys of living in California. I wonder if this will happen in East Palo Alto too.

4
learc83 8 hours ago 1 reply      
What about decriminalizing possession of small quantities of marijuana that went into effect in 2011? Or medical marijuana that is also fairly recent.
5
cranklin 2 hours ago 0 replies      
In my opinion, it's simpler. Gang culture is cyclic. It's simply not "cool" to be a gangster nowadays, therefore insecure kids don't aspire to be gangsters. But like all things, in time, it will be "in" again.
6
techwatching 9 hours ago 1 reply      
I wonder to what degree technology has played a role. The article references gang taunting etc. via the internet instead of on the street - would be interesting to know to what degree social networks, video games, etc. have had an impact on gang activities, and also the degree to which drug sales have moved off the street - i.e.: you don't need people hanging on the corner to sell drugs anymore.
7
spydum 8 hours ago 4 replies      
perhaps it's the rise of social networks, facebook, flappy bird, etc which has neutralized all gangs.. kids nowadays can't put their phones down, get off the couch. they've gotten lazier?
8
spiritplumber 8 hours ago 1 reply      
Now gangs are bigger and their members wear badges. Sort of like the ending of A Clockwork Orange...
9
zo1 3 hours ago 0 replies      
Perhaps less gun ownership was needed when gang-violence occurred less? If they do correlate, I'm not entirely sure we could figure out the chain of causation without jumping to conclusions.
9
Arctic Fibre Project to Link Japan and U.K.
47 points by mmastrac  7 hours ago   8 comments top 3
1
mytochar 3 hours ago 2 replies      
For those of you hunting to see the length of the cable like I was, it's 15,600 km long. I wish they'd provided a top-down view rather than a lateral view like they did, since that would make more sense and have the image look shorter.

Anyway...

I wonder if this will have a positive impact on the arctic communities that the cable goes through. Is there any history on technological booms and hubs appearing in places where a new cable went through previously low-occupancy areas?

2
jedberg 5 hours ago 0 replies      
This is great news for Alaskans. The internet there is terrible and ISPs do a lot of caching that they shouldn't to make up for it (like setting the min TTL for all DNS entries to 7 days regardless of what they get from the server).

Finally they'll be able to get decent internet without all the shenanigans!

10
Ruby talks of 2014
134 points by lackoftactics  12 hours ago   16 comments top 6
1
Aeolus98 18 minutes ago 0 replies      
It's interesting to note just how applicable these talks are to other languages. That database one is something i'd recommend a high schooler listen to for their next hackathon.
2
potomak 12 hours ago 2 replies      
In my opinion one of the best Ruby talks of 2014 is "Refactoring Ruby with Monads"[0] by Tom Stuart held at Barcelona Ruby Conference.

[0] https://www.youtube.com/watch?v=J1jYlPtkrqQ

3
tcopeland 11 hours ago 0 replies      
Here's my list of (mostly) Ruby-related good talks from this year:

http://thomasleecopeland.com/2014/10/18/good-technical-video...

Andrew Turley's "What we can learn from COBOL" talk is probably my favorite.

4
llamataboot 5 hours ago 1 reply      
I'll put a humble plug in for my talk "Care and Feeding of your Junior Developer" from Nickel City Ruby. It was on top of Confreaks for a few weeks and people seemed to like it, so if you are a junior, or you mentor juniors, check it out.

http://confreaks.com/videos/4659-nickelcityruby2014-how-to-b...

5
danso 11 hours ago 4 replies      
That this list got upvoted to the top spot must be some sign that the HN audience is still enthused about Ruby-related stories (I say that as a Ruby fan myself)... It's a good list and I'll bookmark all of its items, but it is pretty brief, both in items (5) and in the summarization (and the use of the overexcited cliché "killer" in the original title makes me extra curmudgeonly about it).

Or are people really enthused about talks as a way to learn things? I haven't been to a talk in awhile and I just can't get into video learning. So FWIW, I'll throw in the Ruby-related-infothing that I was most excited about this year: the publishing of "Metaprogramming Ruby 2:" (https://pragprog.com/book/ppmetr2/metaprogramming-ruby-2)...The original version was one of the most helpful books to me as a Ruby beginner in 2010, both in learning the language and learning new ways to think about programming.

6
meesterdude 9 hours ago 1 reply      
Wow, Sandi's talk was really interesting. She's working on a Rails book so I'll be interested to see her approach to Rails.

I'm pretty good with "small methods" but her refactor from that to "small objects", while more flexible, seems like it's more complicated in the end. I mean, if you're going to have a lot of those it makes sense, but I feel like it could equally be called premature; still, I respect that it is more tolerant to future unknown changes. "Small methods" is also easier for a junior developer to support, and I think that should be taken into consideration.

11
Static Linux
115 points by joseflavio  16 hours ago   41 comments top 8
1
ultramancool 9 hours ago 3 replies      
Cool idea. Not enough people seem aware of executable packers like UPX http://upx.sourceforge.net/ though.

These are excellent tools to keep the size down when using large static binaries. By compressing the file on disk and decompressing in memory you often wind up with a smaller and sometimes even faster loading (depending on disk IO speed vs decompression speed) package. I got a static Qt binary from 4 MB down to 1.3 with upx --lzma. Very nice stuff.

2
stonogo 10 hours ago 1 reply      
I have been keeping an eye on this "project" for years and have yet to see anything come of it except lightning talks.

suckless.org seems to focus on their web browser and their xterm clone these days, judging by the listserv traffic.

3
zackmorris 7 hours ago 3 replies      
I wish Linux would replace dynamic libraries (especially ones referencing specific paths) with a system based on the library's hash. Then we could have a single lib folder with 1 copy of each library, and get ourselves out of dependency hell by just making dynamic loading act like static loading. We could even download libraries from the web on the fly, if needed. Heck it would even remove a lot of the need to compile every_single_time because apps could reference binaries.

The notion of being able to fix an app by merely upgrading a library it depends on has not worked out in practice. More often than not, when I upgrade a library, I find myself having to upgrade my apps code because so much has changed. The burden of having to constantly backup, upgrade, manually tweak config files, over and over and over again for days/weeks/months was SO not worth the few hundred megabytes or whatever dynamic loading was supposed to have saved.
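
A hypothetical sketch of the content-addressed idea described above; the store location, file naming, and API are invented purely for illustration and are not any existing Linux mechanism.

  # Store each shared library under its SHA-256 digest and load
  # dependencies by that digest, so "which version" is unambiguous
  # and one copy can serve every consumer.
  import ctypes
  import hashlib
  import shutil
  from pathlib import Path

  STORE = Path("/tmp/lib-by-hash")   # invented location for the example

  def add_library(path: str) -> str:
      """Copy a library into the store, keyed by its content hash."""
      digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
      STORE.mkdir(parents=True, exist_ok=True)
      target = STORE / f"{digest}.so"
      if not target.exists():
          shutil.copy2(path, target)
      return digest

  def load_library(digest: str) -> ctypes.CDLL:
      """Load a library by hash; fails loudly if the exact bytes are absent."""
      target = STORE / f"{digest}.so"
      if not target.exists():
          raise FileNotFoundError(f"library {digest} not in store")
      return ctypes.CDLL(str(target))

  # Example usage (assumes libm exists at this path on the host):
  # digest = add_library("/usr/lib/x86_64-linux-gnu/libm.so.6")
  # libm = load_library(digest)
  # print(libm.sqrt)  # symbols resolve as usual once loaded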

4
curlyquote 9 hours ago 1 reply      
Static linking OpenSSL is probably not a good idea
5
dfc 7 hours ago 2 replies      

  > Because dwm is customized through editing its source code, it's pointless to make binary packages of it. This keeps its userbase small and elitist. No novices asking stupid questions.
I never understood why the authors of dwm thought this was a "nice" feature of configuration via source code.

6
leakybucket 7 hours ago 1 reply      
An advantage of dynamic libraries is that the memory used to hold the library's executable pages can be shared across processes. So using static only binaries will lead to less free memory on the OS.
7
proveanegative 9 hours ago 1 reply      
Are there any static BSDs?
8
spiritplumber 10 hours ago 1 reply      
Wonder if it helps with dep hell...
12
MRI Developers Don't Use RubySpec and It's Hurting Ruby
216 points by jc00ke  8 hours ago   82 comments top 17
1
nateberkopec 6 hours ago 4 replies      
Rubyspec is a great project for all the reasons Brian outlines.

However, it's also a failure - in part, due to Brian and his attitude towards contributors. See this twitter conversation: https://twitter.com/the_zenspider/status/547527644535726080 He's been grinding on Rubyspec for years, bless him, but I think there's a reason why he was unable to rally the community behind his effort, both in terms of gathering more contributors and in making it "official" in terms of the language spec.

JRuby, currently the only non-MRI ruby implementation that you can seriously consider for production use, runs the MRI test suite against JRuby. RubySpec is far from The Only Solution, though Brian would like you to think it is.

2
vorg 50 minutes ago 0 replies      
It's disappointing to hear about such problems with Ruby. I hit the same problem with the Groovy Language spec when I first came across Groovy. Its creator, James Strachan, initiated an implementation, test kit, and spec all within 6 months of each other (impl beta-1 in Dec 2003, and spec JSR-241 in May 2004). The project managers who took over from him, Graeme Rocher and Guillaume Laforge, changed direction by stopping work on the spec and refocusing the Groovy reference implementation to be the scripting language behind Grails. (Of course, Groovy 'n' Grails was intended to chisel away at some of the market share of Ruby on Rails but that's another story.) Strachan often wrote that the spec was to enable anyone to make their own implementation of Groovy if they want to, and right up to his very last posting ever http://groovy.329449.n5.nabble.com/Paris-write-up-tt395560.h... on the Groovy mailing list on 5 Dec 2005, he maintained that what they were building was the reference implementation.

If Rocher and Laforge had come clean about how they turned the RI into the language itself, the backlash might have blown over quickly, but instead they led developers along for many years afterwards, not changing the spec to dormant until April 2012. Projects other than Grails who've tried to build atop Groovy have had to risk the ref impl changing in breaking ways between versions. The most spectacular incident was when Groovy++, an experimental static compiler built by Alex Tkachman that hooked via annotations into Groovy's AST, had to drop back down from Groovy 1.8 to 1.7 in 2011, and my own side project was also affected by the change. It turned out Rocher and Laforge had secretly employed a mate to extend Groovy with the exact same static type-checking and compilation functionality as Groovy++ and were obviously trying to shake us off.

Unlike Ruby, Groovy only has one other implementation, GrooScript, built by Jorge Franco, which generates JavaScript from Groovy syntax. When the developers of the most used implementation of a language want to protect their control, it certainly does hurt the ecosystem, turning it into an "echo system".

3
lgleason 5 hours ago 2 replies      
Both Charles Nutter and Matz are smart guys. They are also both great people. I say this having spent time with both of them. This whole "my implementation is better than theirs" attitude coming from Brian is crazy.

Matz and Charles have helped to set the tone for the community. While I appreciate Brian's passion, it sounds like he needs to check his ego. No matter how smart we are, we can always learn things from other people. Without the collaboration of others, neither JRuby nor Ruby would be what they are today, which is why they are successful.

4
bhrgunatha 1 hour ago 0 replies      
> Later that year, at RubyConf 2008, I gave a talk titled "What Does My Ruby Do" about RubySpec. Matz and several other MRI developers attended. Immediately after my talk, contributors to Rubinius sat down with Matz and other MRI developers to discuss their effort to create an ISO specification for Ruby. We asked whether RubySpec could be part of the specification but were told that it was not appropriate to include it.

This seems telling.

Does anyone have any concrete information about why Matz and the Ruby team are opposed to using RubySpec then?

Has there been any progress on creating an ISO spec?

EDIT: It seems Ruby has a published ISO spec since April 2012 - http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_...

5
sams99 7 hours ago 3 replies      
The thing I find depressing and frustrating about this kind of discussion is the fatalism and non-constructiveness.

I really wish it was:

"I set up a server that runs ruby spec on ruby-head daily and automatically reports spec failures to ruby-bugs"

So many companies are making big bucks off Ruby, yet so few are willing to fork out a bit of money and time to make Ruby better.

6
lnanek2 5 hours ago 0 replies      
Matz is the creator of Ruby. If this guy really wanted to play ball, why didn't he just submit PRs for improving the real Ruby's test suite? Instead he just made his own thing, well duh it didn't end up a part of the real Ruby. OK, a hosting provider wanted their own Ruby implementation to fix concurrency issues, but that doesn't suddenly make the creator of the language have to do things their way. If you fork or reimplement a project and do whatever you want, well you took control so you gained something, but the creator has no obligation to change to your fork/reimplementation.
7
tessierashpool 4 hours ago 0 replies      
I don't want to get in the middle of drama, but if you take anything away from this, read the actual code.

Just read the actual code of the MRI tests.

8
Terr_ 7 hours ago 1 reply      
Regardless of other warts with the language, I really love how the Java Language Specification lays down the law.
9
msie 2 hours ago 0 replies      
What of the ISO standard that mRuby is based on?

http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_...

Aw crap, I have to buy this doc to see it?

10
hartator 7 hours ago 1 reply      
Has Matz stated the reasons they are not using RubySpec? Knowing him, I can't really believe he has chosen to ignore RubySpec without good reasons.
11
danielweber 7 hours ago 1 reply      
Why should they have been using RubySpec?
12
joevandyk 7 hours ago 1 reply      
Why don't MRI developers use RubySpec?
13
dyeje 7 hours ago 1 reply      
Some of the discussions he links to in the article are disturbing. Many posts refusing to implement process because they don't want any process at all. Doesn't seem like a healthy approach.
14
jes5199 6 hours ago 1 reply      
I almost didn't catch the epilogue - he's shutting down the RubySpec project entirely, because it hasn't accomplished what he hoped.
15
serve_yay 7 hours ago 0 replies      
Jeez, what a mess.
16
jc00ke 7 hours ago 0 replies      
I mistakenly cut off the full title of the post: "Matz's Ruby Developers Don't Use RubySpec and It's Hurting Ruby"
17
jagawhowho 6 hours ago 0 replies      
Ruby is about to die permanently. A language that copied emacs-lisp but with syntax? Lol that gives it features to impress the ignoramus majority but real Haskell or Lisp programmers know better.
13
The CPAN Pull Request Challenge
166 points by eperoumal  16 hours ago   41 comments top 16
1
perlgeek 14 hours ago 1 reply      
I'm in.

Some years ago I tried to motivate people to contribute to Perl 6, and found that while many had some lingering interest in doing so, they needed some steering.

This was hard for me to do, because usually in Open Source communities, you aren't supposed to tell people what to do; they are free to choose their occupation after all. But I found that it worked very well.

So I think the CPAN Pull Request Challenge is a very good approach to steer people to particular projects, without causing too much work for those who steer. At least it's a very good experiment to try.

2
kaens 14 hours ago 1 reply      
Count me in! The last time I wrote Perl, the Camel Book was a thing and I thought it was cool to have as few digits in your slashdot id as possible.

I find it pretty fun to try to modify code that lives in a world that I'm largely not part of, this is a nice little impetus!

3
schnevets 14 hours ago 1 reply      
I'm in. People around here are sometimes cynical of Perl, but there's a lot of high quality code on CPAN that definitely shouldn't fall by the wayside.
4
sivoais 10 hours ago 0 replies      
You never know when that pull request could turn into co-maintainer permissions. ;-) That happened to me with 3 packages last month!
5
esaym 11 hours ago 0 replies      
And don't forget to use http://www.cpantesters.org/ so you can see how your test cases run on a variety of platforms.
6
ASUmusicMAN 9 hours ago 0 replies      
What a fantastic idea. I think this will really speak to those of us that may need a little nudge before feeling comfortable. I've dusted off my PAUSE ID and signed up.

Does anyone know of any other langs/communities that have tried something like this? This seems like a great idea to learn and contribute.

7
fibo 11 hours ago 0 replies      
I am in, nice initiative. I've just released https://metacpan.org/pod/Task::BeLike::FIBO and I'm quite glad of my minimal style. Perl is for servers like JavaScript is for the web; it is ubiquitous. I like other languages too, but Perl is really fun, especially because of the community, a great one!
8
edibleEnergy 11 hours ago 1 reply      
In as well. Hope I don't get something like SOAP::WSDL :)
9
wenderen 11 hours ago 2 replies      
I'm interested, but I don't know Perl. Is knowing Perl a hard prerequisite?
10
estrabd 14 hours ago 0 replies      
I signed up. Sounds like a good opportunity for those who want to help, but don't quite know where to get started.
11
creaktive 10 hours ago 1 reply      
1. Join the challenge
2. Get any module assigned
3. Check the gaps in its test coverage using https://metacpan.org/pod/Devel::Cover
4. MOAR TESTS
5. Profit!
12
senorsmile 8 hours ago 0 replies      
I'm in. Been looking for an excuse to use my budding perl-fu.
13
kolom 14 hours ago 3 replies      
Sounds great, is there anything like this for php?
14
rohall 12 hours ago 1 reply      
Sounds cool, I'd do something like this if it was in Ruby. Haven't used perl in several years, not sure I'd be much help
15
twerquie 14 hours ago 2 replies      
> The goal is to help others, possibly learn something, and hopefully have a bit of fun

Seems like it shouldn't be limited to CPAN modules.

16
nsxwolf 14 hours ago 2 replies      
Sounds desperate.
14
What Happened in 2014?
59 points by ninago  13 hours ago   21 comments top 5
1
UweSchmidt 6 hours ago 5 replies      
"8/ we finally got rid of files. dropbox, google drive, soundcloud, spotify, netflix, hbogo, youtube, wattpad, kindle, and a host of other cloud based services finally killed off three letter filenames"

This is amazing and I guess I'll need to meditate over this fact (as I still like those files on my computer, feels as if I own them).

For now I'd still recommend holding on to your important stuff and considering the nature of the relationship between you and your cloud provider.

2
petercooper 7 hours ago 1 reply      
I've always admired Fred's way of laying things out without any nonsense and seemingly quite transparently. I'm very keen to find out what he thinks 2015 will bring tomorrow (and anyone else in the know, really..)
3
crimsonalucard 1 hour ago 0 replies      
People will only know what really happened in 2014 when it's 2017. It's too soon to say whether something is just a fad or an actual paradigm shifting trend.
4
sysk 2 hours ago 0 replies      
I love hating on VCs as much as the other guy, but I must admit I'm envious of the ability of Fred Wilson (and VCs in general) to zoom out and see the bigger picture. I wonder if that's a skill that can be learned/taught.
5
jobu 7 hours ago 3 replies      
"7/ youtube became a monster. it always has been. but in 2014 youtube emerged as the place for entertainment consumption for anyone under 16"

This seems so huge to me - maybe it's because I have kids and saw it happen firsthand over the course of this year.

15
The two cultures of mathematics and biology
97 points by Fede_V  12 hours ago   48 comments top 12
1
pvaldes 35 minutes ago 1 reply      
In fact it is not related to mathematics or biology. It is just a literary problem. They are not paying enough attention to the inner flow of their work.

An example:

"This is the field where one STUDIES the locus of

solutions of

sets of

polynomial equations

by combining the algebraic properties of

the rings of

polynomials with

the geometric properties of

this locus,

known as a variety.

Traditionally, this HAD MEANT complex solutions of

polynomials with

complex coefficients but

just prior to Grothendieck's work,

Andre Weil and Oscar Zariski HAD REALIZED that

much more scope and insight WAS GAINED by

considering solutions and

polynomials over

arbitrary fields,

e.g. finite fields or

algebraic number fields."

........This is "almost lisp code" in its structure.

The inner flow here is really dislocated with all those interruptions and changes of direction and meaning. There is also a problem with the timeline. They talk in the same paragraph about at least four, maybe five different moments in time, shown in this order: 5,1,4,3 and maybe 2 (being 1 the oldest and 5 the current time). This is probably driving the reader to distraction (I find it pretty annoying at least)

2
IndianAstronaut 11 hours ago 4 replies      
>The result is that biology loses out due to the minimal real contact with math - the special opportunity of benefiting from the extra sense is lost, and conversely math loses the opportunity to engage biology

When I was a biology grad student, I was the only person in the department that tried to do active collaboration with the math department. I took more math classes during my graduate program than biology classes.

On the biology side, I got ridiculed for all the 'hand waving' that seems to happen with the math. Biologists want to see concrete experiments and results.

On the math side I found people to be much more open and fascinated by the biology, but they had a tough time explaining what they were doing to a lay audience.

No easy answers, but I think more programs should graduate people with dual skills in both subjects (and of course have job opportunities for those grads, instead of having them jump into industry like I did).

3
mbq 10 hours ago 1 reply      
I was once at a workshop organised by and for mathematicians which was supposed to be about biology. There was a talk where a guy was discussing a PDE system describing some colony growth; at some point near the beginning he concluded that the results for real input were "boring", and that one can get "intriguing chaotic behaviour" with a negative cell number, and continued with that assumption (;
4
chubot 8 hours ago 1 reply      
Meh, I think he just didn't explain what a "scheme" is very well. Compare it to the NY Times piece. From that, I immediately understand that it is an elegant generalization of solutions to polynomial equations.

http://www.nytimes.com/2014/11/25/science/the-lives-of-alexa...

Compare it to:

''The proper foundations of the enlarged view of algebraic geometry were, however, unclear and this is how Grothendieck made his first, hugely significant, innovation: he invented a class of geometric structures generalizing varieties that he called schemes. In simplest terms, he proposed attaching to any commutative ring (any set of things for which addition, subtraction and a commutative multiplication are defined, like the set of integers, or the set of polynomials in variables x,y,z with complex number coefficients) a geometric object, called the Spec of the ring (short for spectrum) or an affine scheme, and patching or gluing together these objects to form the scheme. The ring is to be thought of as the set of functions on its affine scheme.''

Sorry, this writing is just awkward. I can see why the editors of Nature rejected it.

I think the audience of Hacker News is closer to mathematics than the audience of Nature (programming being more closely related to math than biology). I doubt that anyone in this thread unfamiliar with schemes was able to get much of use from Mumford's obituary. I don't even see many comments on the exposition.
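For anyone who wants something concrete to hang that paragraph on, two standard textbook examples of the ring-to-space dictionary (these are not from the obituary, just the usual illustrations):

    \operatorname{Spec}(\mathbb{Z}) = \{(0)\} \cup \{(p) : p \ \text{prime}\}, \qquad
    \operatorname{Spec}(\mathbb{C}[x]) = \{(0)\} \cup \{(x - a) : a \in \mathbb{C}\}

so the points of Spec(C[x]) are essentially the points a of the complex line plus one extra "generic point" (0), and a classical variety corresponds to the Spec of its ring of functions, e.g.

    V(y - x^2) \subset \mathbb{C}^2 \ \longleftrightarrow\ \operatorname{Spec}\bigl(\mathbb{C}[x,y]/(y - x^2)\bigr)

An affine scheme is what you get by allowing any commutative ring on the right-hand side, and a scheme is what you get by gluing such pieces together, which is what the quoted passage is saying.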

5
elberto34 11 hours ago 1 reply      
I think they were right to reject it. I have done plenty of math but I still couldn't 'grasp' the significance of the scheme system, and while I'm sure it is very important to the few people who understand it, I doubt the readers of Nature are among them.
6
aabajian 11 hours ago 3 replies      
Perhaps even more interesting is the gap between mathematicians and medical doctors. Disclaimer: I was a math major who went to medical school.

There is age-old question of, "What should I major in if I want to go to medical school?" Turns out that mathematics majors have the following statistics:

1. Highest average MCAT Physical Sciences scores
2. Highest average MCAT Biological Sciences scores (higher than biosci majors)
3. Second-highest MCAT Verbal Reasoning scores (second only to Humanities majors)
4. Highest overall average MCAT scores
5. Second-highest average Science GPAs (biosci majors are 0.02 higher)
6. Highest average Overall GPAs

Source: https://www.aamc.org/download/321496/data/factstable18.pdf

'Course, math majors make up < 1% of medical school applicants (0.81% to be exact), so this very well may be selection bias. Still, it seems as though what medical schools are looking for are individuals with analytical (mathematical) reasoning skills.

EDIT: It's also worth noting that in many countries an undergraduate education is not a prerequisite for medical school, so there are likely to be even fewer math-major physicians outside of the US.

7
jmount 9 hours ago 0 replies      
8
epistasis 11 hours ago 0 replies      
Lior Pachter is one of the great thinkers and communicators in computational biology. I never miss one of his talks, they're always energetic and enlightening.
9
dang 10 hours ago 0 replies      
Mumford's Grothendieck obituary was posted to HN a couple weeks ago but unfortunately didn't get any discussion.

https://news.ycombinator.com/item?id=8755800

10
impendia 11 hours ago 2 replies      
>Mathematics full professors that are female is a number in the single digits.

I'm not going to let this slide. This would be scandalous if it were true; however:

http://www.mi.uni-koeln.de/Bringmann/
https://www.math.psu.edu/wli/
https://web.math.princeton.edu/~smorel/
http://www.math.tamu.edu/~ptretkoff/
http://en.wikipedia.org/wiki/Maryam_Mirzakhani
http://math.uchicago.edu/~wilkinso/
http://www.math.wisc.edu/~lsmith/
http://www.maths.ox.ac.uk/people/frances.kirwan
http://math.stanford.edu/~ionel/
http://gauss.math.yale.edu/~ho2/

I could easily keep going.

Edit: As ninguem2 points out, apparently the author was referring to the percentage of female math professors. Nevertheless, if you do not count only doctoral-level departments, even that appears to not be true:

http://www.ams.org/profession/data/cbms-survey/chapter4.pdf

(This study is counting all tenured professors, rather than full professors only, but the proportion is well enough north of 10% that I feel it's safe to extrapolate.)

Of course, the proportion of female math professors is terribly and inexcusably low, but the situation is at least slightly less bleak than is painted here.

11
Balgair 10 hours ago 1 reply      
Bashing Nature is almost cliche at this point; anyone in the research world knows about the issues there by now. However, this is comical: "...in which he describes the rejection by the journal Nature of an obituary he was asked to write". I mean, who in their right mind rejects an obit that they asked someone to write? The power that the Nature editors wield is awesome in the ivory tower, and it has gone to their heads.

That said, I was a mathy undergrad and now am in a neuro PhD program. The gulf is large indeed. I think the largest difference for me is the relation to science in general. As a mathy person, we are all about the predictive powers of science. I do A, then B happens at time T. In bio, it is not that at all. Bio is an observational science. I see A, then I see B at time T. Sure, you can make predictions, but what these events all have to do with each other is almost impossible to predict in a living organism/environment. As such, when bio people hear Partial Differential Equation, they go running for the hills.

Case in point, PDEs are no big deal for me, I took an entire class on them. But in one class we had to read a paper on using PDEs to model genetic interactions with a sugar input and then write up 1 single page on it (with some guidelines). Oh man, the riot! 59 of the other people in the class were up in arms about this. They tried to get the points on the paper halved, then eliminated, then the teacher to rescind the assignment, which they were all successful in doing. Then the non-stop complaining ensued for weeks in the halls. All because we had to read a paper with PDEs written out in it. My lord.

On the plus side, it leaves a huge hole that none of the bio people want to crawl down. This is a positive for mathy people, as the bio is more memorization than anything. The bio field is rocking and rolling already from the intrusion that mathy people are mediating. Now, if your lab does not have a CS major in it, you are going to fall behind. The idea that quants and big data people are necessary is just starting to grab hold of the bio world. Now is a good time to get into grad school in the bio field if you have a math background as they are just now starting to realize they need you. It's just hard to get through classes though.

12
throwawayxx 10 hours ago 1 reply      
I started a PhD on the boundaries between mathematics and biology.

I quit soon after finding out that my (mathematics) supervisor had been actively blocking me from communicating with biologists, including blocking meetings with my (biology) co-supervisor.

I gather they thought real, practical concerns would be a distraction from the purity of theoretical problems.

16
Scaling CloudFlare's Massive WAF
46 points by mxpxrocks10  8 hours ago   4 comments top 2
1
mxpxrocks10 7 hours ago 2 replies      
Also, I want to point out that many people at Cloudflare were involved with the optimization of the WAF at Cloudflare including @agentzh https://twitter.com/agentzh He also did a fantastic presentation at nginxconf!
2
puppetmaster3 8 hours ago 0 replies      
I was always wondering what CloudFare does.
17
Winklevoss Bitcoin Trust
121 points by markmassie  12 hours ago   83 comments top 19
1
Animats 7 hours ago 1 reply      
This is not a way to "grow Bitcoin". It's a way for a big holder to dump a lot of Bitcoins without, they hope, crashing the market.

The terms are awful: "The Shareholders' limited rights of legal recourse against the Trust, Trustee, Sponsor, Administrator, Trust Agency Service Provider and Custodian and the Trust's lack of insurance protection expose the Trust and its Shareholders to the risk of loss of the Trust's bitcoins for which no person is liable."

"The Trust will not insure its bitcoins. The Custodian will maintain insurance with regard to its custodial business on such terms and conditions as it considers appropriate in connection with its custodial obligations and will be responsible for all costs, fees and expenses arising from the insurance policy or policies. The Trust will not be a beneficiary of any such insurance and does not have the ability to dictate the existence, nature or amount of coverage. Therefore, Shareholders cannot be assured that the Custodian will maintain adequate insurance or any insurance with respect to the bitcoins held by the Custodian on behalf of the Trust. Further, Shareholders recourse against the Trust, Custodian and Sponsor under [New York] law governing their custody operations is limited. Similarly, the Shareholders recourse against the Administrator and Trust Agency Service Provider for the services they provide to the Trust, including those relating to the provision of instructions relating to the movement of bitcoins, is limited. Consequently, a loss may be suffered with respect to the Trusts bitcoins which is not covered by insurance and for which no person is liable in damages."

I've never seen terms this unfavorable to shareholders in a prospectus before. They're taking on less liability than Mt. Gox took on. If the Bitcoins mysteriously disappear, no one is liable.

2
ucha 10 hours ago 2 replies      
If the SEC approves it, this would be great news for bitcoin. Assuming the ETF is sufficiently liquid, it would allow:

- easy shorting of bitcoins, which facilitates price discovery

- lower transaction costs. The cheapest and most liquid exchanges still charge 0.2% per transaction + spread. Most (all?) of them charge you for getting cash in or out of their platform. Buying shares of an ETF would cost just spread + the transaction cost charged by your broker, which should be much lower ($0.0035 per share on Interactive Brokers, for example); see the rough comparison after this list.

- easy hedging of a real bitcoin position. Let's say you hold a large fluctuating position in bitcoin that you would like to hedge in USD. You could continuously convert all your BTC to USD or go long/short the ETF, which is much cheaper.
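Rough arithmetic for the transaction-cost point above, using the fee figures quoted in this thread plus a made-up trade size and share price purely for illustration:

    \text{exchange: } 0.2\% \times \$10{,}000 = \$20 \ \text{per trade (plus spread and cash in/out fees)}

    \text{ETF: } \$10{,}000 \div \$20\ \text{per share} = 500 \ \text{shares}, \qquad 500 \times \$0.0035 \approx \$1.75 \ \text{per trade (plus spread)}

Under these assumed numbers that is roughly an order of magnitude cheaper per trade, which is the gap people are pointing at.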

3
anigbrowl 4 hours ago 1 reply      
I don't think it matters. Per blockchain.info, the volume of Bitcoin transactions is in gradual decline over the last year and seems stuck at about $50m USD/day, despite many more merchants offering to accept payment. Market cap is in fairly steep decline and the hash rate has been leveling off.

https://blockchain.info/charts/estimated-transaction-volume-...
https://blockchain.info/charts/market-cap
https://blockchain.info/charts/hash-rate

It just struck me that the market cap trend seems to have gone down in very similar fashion to the price of oil over the last 6 months. If enough people who bought bitcoin did so primarily as a hedge, then you'd expect it to loosely track a basket of popular commodities like oil and gold (the price of which looks quite similar to Bitcoin's market cap over the last year IMHO - http://goldprice.org/). Can't wait for Google to get their automatic statistician tool online - I don't like statistics well enough to want to use R regularly but I would love a tool that I can use to quickly measure the coupling between different datasets.

4
dnautics 7 hours ago 1 reply      
By being in the general markets, bitcoin will finally have mass access to leveraged trading. The result will be that bitcoin will finally see a 'true bubble'. While the bitcoin price has been decreasing, it's not really been a bubble as the popping of a bubble is usually twice as fast as its inflation, the opposite of which is true in the current bitcoin decrease.

A good resource on the connection between leverage and financial bubbles is Kindleberger's "Manias, Panics, and Crashes"

5
minimax 10 hours ago 0 replies      
This is just a new version of the S-1. The SEC still hasn't approved the ETF for sale.
6
justinireland 2 hours ago 0 replies      
Doesn't the ETF also open the door to institutional funds that are normally restricted to specific assets? Seems to me that is the biggest advantage of a bitcoin ETF, as it will open the gates to more capital for bitcoin investments.
7
apaprocki 10 hours ago 5 replies      
One thing sticks out as a red flag to me: they invented their own spot index (the Winkdex(R)) to price their NAV and that index includes BTC-e. BTC-e is a widely used site in the Bitcoin world, but no one knows who operates it or exactly where they are located (Bulgaria? Russia?). You would seriously base a large component of your index, pricing an SEC-regulated instrument, on a number coming from unknown individuals who cannot sign a contract or accept any liability? When people in the Bitcoin world wonder "Why did X not include BTC-e?? How incompetent!" they never stop to think that there is no one on the other side that can pick up the pen.
8
califield 11 hours ago 1 reply      
They're going to trade Bitcoin under the NASDAQ symbol COIN. I love it!
9
murbard2 6 hours ago 0 replies      
It's not about the liquidity.
It's not about shorting.
It's not about leveraging.
It's a little bit about ease of investing.
It's a lot about the fact that once this is available, hordes of brokers can make commissions by recommending their clients buy into this ETF.
10
ssharp 10 hours ago 6 replies      
Is this the only way to cash out a large amount of BitCoins?
11
Kiro 7 hours ago 1 reply      
> In March 2014, it was announced that the twins had purchased seats on Richard Branson's Virgin Galactic shuttle using the profits they had made from Bitcoin. [1]

I wonder how many bitcoins they own.

[1] http://en.wikipedia.org/wiki/Winklevoss_twins#Bitcoin

EDIT: From the top of the article: "In April 2013, the brothers claimed they owned nearly 1% of all Bitcoin in existence at the time."

12
jekrb 10 hours ago 1 reply      
If you have access to the dev console I highly recommend setting the max-width of the body to 40em. The text spans all the way across the screen by default.
13
bobcostas55 7 hours ago 0 replies      
I think it's really sad that Bitcoin ended up being so ridiculously expensive to trade that an ETF listed on traditional markets will drop the costs by an order of magnitude.
14
pnathan 10 hours ago 4 replies      
Interesting. If accepted, I am tempted to buy a few shares and see what falls out over time.
15
foobarqux 7 hours ago 1 reply      
The real problem with the ETF is that the index used for pricing is not independent.
16
elwell 8 hours ago 0 replies      
> as measured by the Winklevoss IndexSM (Winkdex)
17
kumarski 9 hours ago 0 replies      
I wish the SEC website was properly responsive.
18
7Figures2Commas 11 hours ago 1 reply      
The Risk Factors section could be tightened up. "Bitcoin lost over half its value in 2014"[1] would probably suffice.

[1] http://www.bloombergview.com/articles/2014-12-23/and-2014s-w...

19
benguild 10 hours ago 4 replies      
I still think it's funny that these guys clearly just went on HN and read about Bitcoin and randomly invested. Good for them though.
18
Junction design in the Netherlands [video]
58 points by bane  13 hours ago   14 comments top 7
1
radicalbyte 6 hours ago 0 replies      
It's not just the roads, it's the whole infrastructure. I live in a modern estate (circa 1990); from here I can cycle to three town centers within 15-20 minutes. Our local shopping center is a 5-minute cycle, or 7-8 minutes by car.

My wife cycles to work, and takes our 15 month old to daycare on her bike. Next year I'll be able to cycle too. Can't wait :)

2
svisser 2 hours ago 0 replies      
If you want to learn more, this video also shows a good foreign perspective: https://www.youtube.com/watch?v=l0GA901oGe4
3
rertrree 7 hours ago 2 replies      
The future of transportation is: separate different modes of transportation as much as possible.

When I was younger, I was regularly biking in between the car lanes (sometimes very risky), skipping the bicycle paths since they were much slower or simply missing. I would almost always get there before I would with a car or a bus, and parking wasn't a problem. If the lanes were completely separated this could be done safely.

From my experience, roundabouts where the bicycle lane isn't separated are the least safe type of intersection for bicyclists. I think the cause is that, unlike normal intersections with stop signs or lights, where stopping is the usual procedure, roundabouts are fluid: drivers stop much less and thus "prefer" not to stop, or just assume they have the right of way.

4
bradleyboy 3 hours ago 1 reply      
Wonder if cyclists there optimize their route for right turns (a la UPS) since it is so convenient in this setup.
5
bane 5 hours ago 0 replies      
This is how to do it. Clean separation of road vehicles from bikes, none of this painted line in the parking lane nonsense. Engineer for the most vulnerable, give space for everybody.

Even as a driver, this is how I want things to be.

6
mrbman7 4 hours ago 4 replies      
Amusing that no one wears helmets.
7
bmsleight 8 hours ago 0 replies      
Quite a good insight. The Dutch design for the motorist last. Hence cycling is prevalent.
19
Eventual Consistency in Concurrent Data Structures
64 points by patrickMc  10 hours ago   2 comments top 2
1
No1 4 hours ago 0 replies      
After a cursory read, I don't buy the guaranteed constant time claim.

"If there are so many competing adjacent deletes that you exceed this number of traversals, you just stop on the 10th deleted node and modify it, editing out the following 10 deleted nodes. The upshot is a list that steadily shrinks..."

...unless you're inserting and deleting more than 10 consecutive nodes all the time. Then the list steadily grows. Though I could be missing something.
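To make the mechanism being questioned concrete, here is a rough single-threaded sketch of the idea as I read the quoted description (a deleted flag for logical removal, plus a bounded clean-up pass during traversal); the real structure does the unlinking with atomic operations, which are deliberately left out, and the cap of 10 is just the number quoted above.

    #include <stdbool.h>
    #include <stdlib.h>

    /* `head` is assumed to be a live sentinel node that is never deleted. */
    typedef struct Node {
        int          value;
        bool         deleted;    /* logically removed, but still linked */
        struct Node *next;
    } Node;

    void traverse_and_compact(Node *head, int cap, void (*visit)(int))
    {
        Node *prev = head;                     /* last live node seen            */
        Node *cur  = head ? head->next : NULL;
        int   run  = 0;                        /* consecutive deleted nodes seen */

        while (cur) {
            if (cur->deleted) {
                if (++run == cap) {            /* long run of garbage: edit it out */
                    Node *stop = cur->next;
                    Node *n    = prev->next;
                    while (n != stop) {        /* physically unlink and free the run */
                        Node *next = n->next;
                        free(n);
                        n = next;
                    }
                    prev->next = stop;
                    run = 0;
                    cur = stop;
                    continue;
                }
            } else {
                visit(cur->value);             /* only live nodes are reported */
                prev = cur;
                run  = 0;
            }
            cur = cur->next;
        }
    }

The question in the comment above is then whether inserts can keep creating fresh runs of deleted nodes faster than this capped clean-up retires them.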

2
yshalabi 2 hours ago 0 replies      
Outside of social media (comment lists, facebook feeds, etc.) are there any other examples where EC is an appropriate replacement for linearizability?

If you want sequential behavior from your datastructure in the presence of concurrency then, sorry, but linearizability is the only model that meets the requirements. I imagine 99% of programs want their concurrent lists/stacks/queues/maps to behave like their sequential counterparts.

20
Key lessons medical schools dont teach
14 points by timthorn  6 hours ago   discuss
21
The Rise of the 1099 Economy: More Americans Are Becoming Their Own Bosses
40 points by edward  11 hours ago   38 comments top 4
1
jacquesm 7 hours ago 4 replies      
More people starting out for themselves can have several causes, and one of those strongly correlates with unemployment. Not everybody that starts for themselves does so because they think that 'working for the man' is no longer for them, as often as not it is that the man has decided that he has surplus employees and kicks them to the curb. That those people then need income and that many of them are then forced to start their own business or perish is a direct effect of this. If you throw that many people into the meatgrinder some of them will succeed but let's not pretend that those choices were made of free will.
2
jeffreyrogers 6 hours ago 1 reply      
Funny how we went from the 17th/18th century model of self employed craftsmen, to the 19th/20th century model of working for a large manufacturer, and now the trend appears to be back towards self employment.

Of course, as jacquesm noted in his comment, a lot of this is probably due to people being unable to find other, desirable work.

3
xacaxulu 1 hour ago 0 replies      
It's almost as if there is some sort of huge incentive for organizations to not have full time employees or have to pay for their healthcare, vacation days, taxes etc.
4
mark_l_watson 7 hours ago 3 replies      
Except for working at WebMind and Google, I have been a 1099 consultant for about 16 years.

One thing I worry about is that the US government, which seems very keen on extracting taxes from non-rich people, will start plugging up some of the fair tax write-offs that 1099 workers get.

The first part of this process, I think, has been the pressure on companies to make people work as W-2 workers. W-2 workers have fewer tax write-offs.

22
Readings in conflict-free replicated data types
16 points by deegles  6 hours ago   3 comments top 2
1
ahelwer 1 hour ago 0 replies      
From the first section, I'd personally recommend only reading the two 2011 papers by Shapiro et al. The other three just rehash descriptions of the same TreeDoc data structure, and are only interesting if you want to get some historical context behind the idea (upshot: recent developments were motivated by collaborative text editing). The list also ignores Baquero et al's earlier work on state-based CRDTs (also only relevant if you care about historical context).

Also, I've been working on putting together the CRDT wiki page[0]. I'd love to hear feedback on the talk page!

[0] https://en.wikipedia.org/wiki/Conflict-free_replicated_data_...

2
tsantero 3 hours ago 1 reply      
I doubt the author has read even half of the papers he listed.
23
Let's try to motivate schemes
23 points by jmount  9 hours ago   discuss
24
Why Sweden Has So Few Road Deaths
95 points by sethbannon  15 hours ago   213 comments top 22
1
tomohawk 9 hours ago 2 replies      
I'm acquainted with a civil engineer who worked in the FHWA for several decades.

He spent a big chunk of that time trying to convince government agencies to build roads with pavement that had a better coefficient of friction. See for example

http://safety.fhwa.dot.gov/roadway_dept/pavement/pavement_fr...

He estimated that they could reduce deaths considerably with more appropriate pavement.

In many cases, states refused to even allow pavement friction tests to occur, as they didn't want to be on the hook for paying for improvements if the tests showed poor friction. Also, requiring better friction would disallow certain road materials, which would adversely affect certain very large politically connected contractors.

In Europe, they take this much more seriously. The Germans have started using a concrete pavement that is both better for friction and is quiet.

It's interesting that if a government builds a substandard road that leads to people losing their lives, it doesn't seem to be a big deal, but if a car manufacturer produces a car that leads to a much smaller number of people dying, then it is.

2
dschiptsov 13 hours ago 6 replies      
* Sparse, mostly rural population (with an unique culture).

* "2+1" roads, which discourages meaningless overtaking or competing. (But I don't buy it as the single cause).

* Roundabouts instead of crossroads with traffic lights. (My bet this is one of the most important factors).

* Special reserved lane for doing left turns to secondary roads.

* Lots of street lamps (Sweden is famous for illumination and urban development).

* High percentage of expensive, very good quality cars (lots of Volvos) which kept well-maintained.

* 40 km/h speed limit in most small towns.

Together it works. Basically a complex phenomenon is a weighted sum of multiple different causes - cultural (tradition), economical, social (current "normals"), technological, etc. - with "random variables". Such a simple model could explain it to some extent.

I am in Sweden now, and I drive a classic Volvo.

3
pash 9 hours ago 4 replies      
The article ignores the single biggest reason Americans are likelier than Swedes to die on the roads, which is simply that Americans drive more. The death rate is nearly four times higher per capita in the United States, but only twice as high per mile driven [0].
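Spelled out, with d = deaths, m = miles driven and p = population:

    \frac{d}{p} = \frac{d}{m} \times \frac{m}{p}
    \quad\Rightarrow\quad
    \frac{(d/p)_{\mathrm{US}}}{(d/p)_{\mathrm{SE}}} \approx 4
    \ \text{and}\
    \frac{(d/m)_{\mathrm{US}}}{(d/m)_{\mathrm{SE}}} \approx 2
    \quad\Rightarrow\quad
    \frac{(m/p)_{\mathrm{US}}}{(m/p)_{\mathrm{SE}}} \approx 2

In other words, roughly half of the per-capita gap is explained simply by Americans driving about twice as many miles per person.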

(In fact, in America, the last few decades' worth of safety improvements to cars have been almost entirely offset by an increase in miles driven; despite their much safer cars, Americans are only slightly less likely to die driving today than they were in 1990 [1].)

The bigger mystery is why Americans are now far likelier to die in their cars than Europeans even after you control for the time they spend in them. That didn't use to be the case. Until the mid-1990s, Sweden (like almost every other country [2]) had a higher rate of traffic fatalities per mile driven than did the United States. Those rates have declined markedly all around the world, but much less in the United States than elsewhere.

My guess is that the divergence comes down to diverging patterns of living. Over the last few decades, American cities have sprawled all over the landscape, so Americans not only drive more, but their road infrastructure is built to facilitate daily travel at high speeds over long distances. A great many Americans drive 30+ highway miles each day between their suburban homes and city offices, but that sort of commute is rare elsewhere.

0. http://en.wikipedia.org/wiki/List_of_countries_by_traffic-re...

1. http://www.planetizen.com/node/68200

2. See, e.g., statistics showing that fatalities per vehicle mile were lower in the United States than in many European countries in 1991, https://www.fhwa.dot.gov/ohim/hs93/Sec7.pdf [PDF; see page 4]

4
_almosnow 1 hour ago 0 replies      
>Building 1,500 kilometres (900 miles) of "2+1" roads, where each lane of traffic takes turns to use a middle lane for overtaking, is reckoned to have saved around 145 lives over the first decade of Vision Zero.

Basically translates to: "this substantial increase in costs (3 lanes instead of 2, +50%) is going to save the lives of 15 people per year". It is really praiseworthy for a country to just go for it and put people first. A human life is truly invaluable but very few countries actually account for them as such; in most other countries "15 dead" would simply just not be worth the investment.

5
netcan 13 hours ago 4 replies      
George Orwell:

.. Accidents happen because on narrow, inadequate roads, full of blind corners and surrounded by dwelling houses, vehicles and pedestrians are moving in all directions at all speeds from three miles an hour to sixty or seventy. If you really want to keep death off the roads, you would have to replan the whole road system in such a way as to make collisions impossible. Think out what this means (it would involve, for example, pulling down and rebuilding the whole of London), and you can see that it is quite beyond the power of any nation at this moment. Short of that you can only take palliative measures, which ultimately boil down to making people more careful.

But the only palliative measure that would make a real difference is a drastic reduction in speed. Cut down the speed limit to twelve miles an hour in all built-up areas, and you would cut out the vast majority of accidents. But this, everyone will assure you, is impossible. Why is it impossible? Well, it would be unbearably irksome. It would mean that every road journey took twice or three times as long as it takes at present. ..

http://orwell.ru/library/articles/As_I_Please/english/eaip_0...

6
arethuza 14 hours ago 1 reply      
Even when you look at deaths per billion vehicle-km the US still seems to be quite bad at 7.6, with the UK at 4.3 and Sweden at 3.7:

http://en.wikipedia.org/wiki/List_of_countries_by_traffic-re...

7
kazinator 4 hours ago 0 replies      
National mentality has to do with it. For instance, you see a night-and-day difference in driver behavior when you cross the border from, say, Switzerland to Italy.

Why poor countries have lots of deadly car accidents probably has something to do with corruption, in part. People won't respect order, such as traffic codes, when their government isn't respectable. If I imagine I'm a citizen of a poor country, why would I stop at a red light if my president gave a contract to some foreign company in exchange for bribes. If he can do as he pleases, so can I.

There is also driver licensing. Here in the provinces of Canada, for instance, if you have a measurable pulse and/or your breath can fog a cold mirror, you can get a driver's license. I can't imagine that the same is true in Sweden.

A learner's permit can be obtained by passing a written test. After that, a very brief and easy road test results in a driver's license. No formal schooling is required; a friend or relative can teach you, for instance.

Furthermore, driver's license renewals do not require any testing. I got my driver's license in 1987. Since, then, I have renewed it every five years simply by paying the fee, signing a paper, and having my picture taken. I've never been required to take any additional training or testing.

Imagine if it was like that for, say, commercial airline pilots.

8
bluedino 10 hours ago 1 reply      
Whenever I see a non-highway fatality here in the US, it falls under a few categories:

* head-on because someone crossed the center line on a 2-way, 40-55mph road, and they are usually going well over the speed limit

* one of the cars was running the light at a 4-way intersection

* someone pulled out into traffic into a 40-55mph road and was hit

9
Retric 14 hours ago 0 replies      
The US saw a huge advertising effort trying to convince people that pedestrian fatalities were not an issue with either cars or drivers. Realistically, while it costs more and is less convenient for drivers, a better separation between cars and people is going to save plenty of lives.

However, only 14% of traffic fatalities (~4,743/year) are pedestrians in the US. ed:(+ 726 bike/year) So there is a limit to how far that can take you. http://www.pedbikeinfo.org/data/factsheet_crash.cfm

10
mytochar 9 hours ago 4 replies      
I found this part particularly interesting, for strange reasons: "now less than 0.25% of drivers tested are over the alcohol limit"

Why are they testing those other 399 people (0.25% = 1 in 400)? Driving while drowsy? drunk driving checkpoints? Someone's a little swervy and gets noticed?

11
01Michael10 14 hours ago 5 replies      
Roads like the ones in Sweden will never be common place in the US. We value speed, convenience, and cheap over a "couple" of dead people.
12
thomasahle 14 hours ago 12 replies      
"Building 1,500 kilometres (900 miles) of "2+1" roadswhere each lane of traffic takes turns to use a middle lane for overtakingis reckoned to have saved around 145 lives over the first decade of Vision Zero."

I'm no expert in roads, but can anyone explain how that shared middle lane is not the recipe for disaster?

13
ptaipale 8 hours ago 1 reply      
Regarding "Vision Zero": I find it unrealistic, for the simple reason that you can't stop people from dying. I think Sweden must share the same way of counting as my country (Finland) where the traffic death statistics include suicides and "natural" deaths.

We have around 250 road deaths per year in a country of 5 million, and about 50 of these are suicides. This means that practically every week there's someone who kills him/herself by an intentional act in traffic. For instance, a couple of months back a mother killed herself and her three children by driving 100 km/h head-on at a bus in which her husband was travelling. This was just an act of defiance, desperation and hate.

When suicides account for about one fifth of traffic deaths, it is not insignificant and leaves the number of traffic deaths well above zero. Also, natural deaths - people dying of a heart attack or a stroke while driving - account for a significant part of the statistics (somewhere around 10-15 %). They're counted as traffic deaths even in cases where no one else is hurt and even the vehicle is unscathed.

14
kristofferR 14 hours ago 2 replies      
In addition to the road quality I would say the driver's ed is also a big factor. Getting a license in Sweden is much tougher than most other places, especially the US.
15
xasos 9 hours ago 0 replies      
>Planning has played the biggest part in reducing accidents. Roads in Sweden are built with safety prioritised over speed or convenience. Low urban speed-limits, pedestrian zones and barriers that separate cars from bikes and oncoming traffic have helped.

This is big. Cities in Sweden have the ability to cater much faster to bikers because biking is a much more prevalent way of transport than in the United States. In fact, that is one of the larger causes for deaths in cities like San Francisco. [1]

[1] http://www.sfgate.com/bayarea/article/Streets-of-S-F-a-road-...

16
ErikHuisman 14 hours ago 1 reply      
Whenever I'm in Sweden I just drive one road, the E4. I bet the asphalt per capita is probably pretty low compared to any other country. But still, the quality of the roads doesn't compare to Denmark or the Netherlands, for example, which feel even more thought out and structured.
17
protomyth 9 hours ago 0 replies      
I've seen 2+1 in Minnesota (cannot remember which road, but it is a highway off I90 and headed to the cities). In SD and ND, I've seen the added lane for slow traffic to move to so the normal traffic can pass. Same effect, but probably an acknowledgement of the number of slow vehicles found on rural highways (e.g. combines, tractors, heavily loaded grain trucks).
18
pc86 14 hours ago 5 replies      
Can someone with knowledge or experience on the subject explain how the "2+1"[0] road is better than a standard 3-lane highway?

[0] FTA "Building 1,500 kilometres (900 miles) of "2+1" roadswhere each lane of traffic takes turns to use a middle lane for overtakingis reckoned to have saved around 145 lives over the first decade of Vision Zero"

19
finid 9 hours ago 2 replies      
In which city in the US of A do you see bike lanes like that in the article?

In my neck of the woods, riding a bike is a disaster waiting to happen.

20
happyscrappy 14 hours ago 7 replies      
From the comments:

"Sweden has 264 road deaths for 9.5 million people. Britain has 1730 for 62 million people. The deaths rates, at 28 per million people, are virtually identical in both countries and way below every other European country.

Britain, though, has impossibly crowded narrow roads laid down hundreds of years ago, full of bends and hazards, large urban conurbations with lots of people, including pedestrians, using the roads at the same time, and almost no new roads designed for safety or anything else.

Yet Britain has no more per capita road deaths than Sweden. Perhaps the Swedes have technical solutions in road planning, design and all sorts of legal restrictions on drivers to aid road safety. Perhaps the British are just more considerate and more aware, that, as their roads are much more hazardous, they have to take a lot more care when driving."

Very surprising that Britain has done that well.

21
psp 9 hours ago 0 replies      
Hah few you say, what about Cliff Burton!
22
ZachS 14 hours ago 7 replies      
I'm not convinced the added safety is worth it.

If you had the choice between getting to your destination in 20 minutes, with an 11.4 in 100,000 chance of dying, or arriving in 25 minutes with a 3 in 100,000 chance, which would you choose?

25
The wreck of the Kulluk
48 points by ilamont  14 hours ago   13 comments top 6
1
DonCarlitos 2 hours ago 0 replies      
Craig Matthews, cited and pictured in the article, is my son-in-law's father. He's told this story a few times in my presence and I've always got the chills. What this article doesn't mention is that when he grabbed the floating cable, the size of a man's arm, with the grappling hook, the two vessels were both riding up-and-down on monster waves. So first, they had to synch both up to even make the grab possible. To hear him tell it, it was a totally hair-raising experience. His peers now know him as "The guy who tied THAT KNOT on THAT DAY." It may be the most famous "Bowline" knot ever after that write-up.
2
jonah 6 hours ago 0 replies      
3
keithpeter 7 hours ago 1 reply      
"Between 5:34 a.m. and 11:29 a.m., according to a later computer analysis by Rolls-Royce, the Aiviqs wire tensile strength overload alarm went off 38 times. It was set to trigger at 300 tons. The alarm, a piercing ring, would not stop until Newill acknowledged it on the computer screen. New to Alaska and new to a ship that was new to the world, Newill later claimed that the alarm never went off. Coast Guard investigators concluded that he mistook the tension alarm for another alarm that was known to be acting up."

A dialog box on screen saying which alarm it was might have been an idea, perhaps. The software was obviously logging all the events.

4
pm90 7 hours ago 2 replies      
Extraordinary story. What I always found strange was how super-rich corporations like Shell will skimp on purchasing the required safety mechanisms, try to avoid taxes, etc., and ultimately end up with disasters such as this one, which could have been avoided if the tug had been stronger or better equipped, and the shackles had been new and strong.
6
keithpeter 7 hours ago 0 replies      
http://arcticready.com/classic-kulluk

Took a few seconds before I realised that arcticready.com is a parody produced by Greenpeace. I was previously unaware of this whole saga.

26
Designing Twilight Struggle, the top-ranked board game
38 points by ryan_j_naughton  11 hours ago   23 comments top 7
1
msluyter 7 hours ago 1 reply      
I like Twilight Struggle a great deal, but I find its top BGG ranking somewhat odd.

I'd wager there's a sort of selection effect: the sort of person willing to sit down for a 2 player game in the first place is probably the sort of hard-core gamer who enjoys highly complex brainburners, and thus is more likely to rate it highly.

Compare that to the sort of moderate/casual boardgamer who prefers something along the lines of 7 Wonders. The former will play 7 Wonders and (may) give it a mediocre rating, while the latter aren't even going to try TS. Just a theory.

And for the record, the last time I played, TS took 5 hours, but neither of us were fluent and were thus often consulting the rules and/or stuck in analysis paralysis.

2
ansible 6 hours ago 0 replies      
For those of you who haven't seen the Tabletop series [1] yet, it's a fun way to get a feel for the many shorter board games out there. Previous seasons had shows on Ticket to Ride, Settlers of Catan, and King of Tokyo mentioned in the article.

[1] https://www.youtube.com/playlist?list=PL7atuZxmT954wz47aofSl...

3
jaryd 8 hours ago 2 replies      
As a major Twilight Struggle fan I urge any newbies to check out: http://twilightstrategy.com/!

(I am not affiliated with twilightstrategy.com in any way)

4
zeroonetwothree 7 hours ago 3 replies      
I tried playing this game but I really didn't like it. I found it way too long and complicated. We never actually finished a game. I much prefer <1 hour games.

So sad, since there are so few great two player games. I was really hoping this would be one of them.

5
finnh 7 hours ago 2 replies      
I am dismayed that Connect Four has a below-average rating. Still a great game in my book ... I burned many hours in Thailand playing this on the beach, Beer Chang in hand.
6
CurtHagenlocher 6 hours ago 1 reply      
It's a computer game and not a board game, but I'm surprised that "Balance of Power" wasn't mentioned as a predecessor.
7
michaelochurch 5 hours ago 2 replies      
Two-player games are attractive for a couple of reasons. First, by definition, half the players win. People like winning, and are likely to replay and rate highly a game they think they have a chance to win. Also, with just one opponent, there is little downtime. You don't have to wait while the turn gets passed around the table to three, four or five other players. That's boring.

I can give some insight into why 2-player games are popular. (ETA: when I wrote this, I missed that 3-player games do slightly better.)

I'm a guy who engineered (optimized?) the fuck out of a 4-player card game (to minimize card-luck in a trick-taking game) called Ambition: https://docs.google.com/document/d/1S7lsZKzHuuhoTb2Wj_L3zrhH...

I succeeded in the design challenge, and I think it's a damn good game, but the game hasn't caught on and player-number (or, for math people, player arity) is almost certainly the biggest culprit. It takes exactly 4. (There's a 3-player variant, but it's not as fun and I'd rather just play a game designed for 3, like Skat.) The reason I bring this up is that I have insight into what 2-player games do so well. It's cultural. A 2-player game (or a 2-party game, like Bridge which has 2 teams of two players) is a showdown and it's decisive. It develops a "mind sport" culture. There's less of a feeling of decisivity in a 3- or 4-player game because you can't always separate "A outplayed B and C" from "B played best but C's mistakes unintentionally advantaged A over B and A won".

Game outcomes generally have four components: Chance, skill, strategic interaction, and flux. Flux is minute-by-minute engagement in the game (which can be modeled as a fluctuation in skill). Skill and flux are what we care about. The "strategic interaction" term might seem odd, because "strategy" and "skill" are often used interchangeably, but it actually refers to the scenario in 3+ party games where A, seeking his own strategic goals, unintentionally advantages B over C. This is sometimes called "strategic luck". (Or, it can devolve into a king-maker scenario. Or it can become an influence of table position, as in Puerto Rico, that some dislike. Or it can be made a part of the game, as in Diplomacy, where people pre-arrange interaction effects-- but might defect.) It's inevitable if you have more than 2 parties. And it makes it easy for people to feel "screwed", just as chance does, so while you can have that element and still have a good game (just as I'd argue that random chance doesn't make a game "bad", even if the Euro aesthetic eschews it) you're not likely to develop the "mind sport" culture of Go, Chess, or Bridge if that's the case.

I don't know for sure if this explains 2-player games getting higher ratings, but 2-player games and variable-player games (like Texas Hold 'em, which accommodates varying player number better than, say, a highly-engineered trick-taking game) do the best job of catching on and developing a reputation. A 2-player game is a showdown and a variarity game accommodates a group of unplanned size; exactly 3, 4, or 5 limits the audience and "catch on" speed.

27
New Solar Power Material Converts 90 Percent of Captured Light into Heat
147 points by lelf  18 hours ago   79 comments top 11
1
davidovitch 13 hours ago 2 replies      
I find it quite annoying that the original papers are not mentioned anywhere in the article. How difficult can it be to include a link/reference/doi to the source? I understand that for many people these scientific articles are too much, and unfortunately most of the times the scientific sources are still behind pay walls, but it is very important if one wants to verify the claims made in the news article.

I believe the following articles are the ones on which this story is based:
http://dx.doi.org/10.1016/j.nanoen.2014.06.016
http://dx.doi.org/10.1016/j.nanoen.2014.10.018
(behind a pay-wall, unfortunately)

2
mapt 10 hours ago 4 replies      
So... black latex paint? Asphalt? Water?

http://en.wikipedia.org/wiki/File:Albedo-e_hg.svg

The advance is not that they "reached 90%", that's absolutely trivial, and nothing to do with a "Solar Power Material". They seem to be claiming that their advance is a black material they can paint onto thermal pipes that is durable at 1000K for an extended period of time in Earth's atmosphere, made out of blends of nanoparticles.

"High temperature black paint created that lasts 5-10 years instead of 1 year, reducing maintenance needs for concentrating solar thermal" would be a more honest title.

3
jacquesm 16 hours ago 10 replies      
Solar power is a bit of a misnomer here because most people reading that will assume it implies eventual conversion into electricity or motive power.

Electricity is like steak, heat energy is more like hamburger. It's useful but not nearly as useful as electricity, so you're going to need another conversion step (steam turbines are best at this right now) to get to a more usable form of power and that conversion step will have losses (radiation losses, mechanical losses, electrical losses).

So 'overall' efficiency is the key, not the efficiency of a single step in the process (at a minimum, they should list their current efficiency next to the previously achieved maximum for that step, and say how cost-effective the new method is).
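To make the chaining concrete, a back-of-the-envelope sketch: the 90% figure is the article's claim, while the turbine and generator numbers below are illustrative assumptions, not from the source.

    # Chained conversion: each step multiplies in, so the headline 90% shrinks fast.
    light_to_heat = 0.90   # absorber figure claimed in the article
    heat_to_mech  = 0.40   # assumed steam-turbine efficiency (illustrative)
    mech_to_elec  = 0.95   # assumed generator efficiency (illustrative)

    overall = light_to_heat * heat_to_mech * mech_to_elec
    print("Overall sunlight-to-electricity: {:.1%}".format(overall))  # ~34% with these numbers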

4
jsilence 17 hours ago 1 reply      
Converting light into heat is not nearly as difficult as turning it into electricity, so don't mix this up with the 20-24% efficiency that is achievable in photovoltaics.
5
jcr 17 hours ago 3 replies      
phys.org is often just a regurgitation of press releases from the original sources, typically university press sites.

Here's a better source URL:

http://www.jacobsschool.ucsd.edu/news/news_releases/release....

6
rndn 13 hours ago 0 replies      
In other words it's a material which has a very low reflectance across a large range of frequencies (i.e. a black material)?
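That reading matches the headline arithmetic: "converts 90% of captured light into heat" is roughly a solar-spectrum-weighted absorptance of 0.9. A toy calculation with made-up band weights and reflectances (not measured data):

    # Solar-weighted absorptance: alpha = sum((1 - rho) * I) / sum(I) over the spectrum.
    # Three coarse bands (UV/visible, near-IR, mid-IR) with illustrative numbers.
    intensity   = [0.45, 0.40, 0.15]   # rough share of solar energy per band (assumed)
    reflectance = [0.05, 0.08, 0.20]   # assumed reflectance of a "very black" coating per band

    alpha = sum(i * (1 - r) for i, r in zip(intensity, reflectance)) / sum(intensity)
    print("Spectrum-weighted absorptance: {:.2f}".format(alpha))  # ~0.92 with these numbers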
7
Wissmania 17 hours ago 1 reply      
What was the previous best conversion rate? This doesn't get us all the way to electricity, so it's not really interesting info without a point of reference.
8
samatman 13 hours ago 0 replies      
It's like, how much more black could this be? and the answer is none. None more black.

http://en.wikipedia.org/wiki/Black_body

9
andy_ppp 17 hours ago 4 replies      
Does this mean that we are closer to replacing fossil fuels? It seems as though, for energy, we could do it now if we wanted, yet we still obsess over fracking and, to a lesser extent, Middle Eastern oil. The cost difference is certainly not the reason we don't dive in and make these changes.

Those in charge still seem to be of a 'secure the oil and the heroin and you rule the world' mindset.

Oh dear -> https://www.youtube.com/watch?v=xW3XeT7qavo

10
minthd 15 hours ago 1 reply      
We already have materials that can achieve 75% efficiency [1], and probably more (it's just a rough search), so the 90% figure is much less impressive. But they also claim lower cost and lower maintenance than other methods, which is highly valuable.

[1] http://www.mit.edu/~soljacic/cermet_solar-thermal_OE.pdf

11
diltonm 13 hours ago 0 replies      
Checking out some of the Fresnel lens videos on YouTube really helps one appreciate the power of the Sun. If the new material can withstand 700 C, then point a Fresnel lens at it and hook the output up to a Stirling engine for conversion to electricity.
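For a rough ceiling on that conversion step: with a 700 C hot side and an assumed ~25 C ambient, the Carnot limit works out to about 69%, and practical Stirling engines land well below it.

    # Carnot limit for a heat engine fed by a 700 C absorber, rejecting to ~25 C ambient.
    t_hot  = 700 + 273.15   # hot-side temperature from the comment, in kelvin
    t_cold =  25 + 273.15   # assumed ambient temperature, in kelvin

    eta = 1 - t_cold / t_hot
    print("Carnot limit: {:.1%}".format(eta))   # ~69%; real Stirling engines get far less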
28
Laws Restricting Tech Actually Expose Us to Greater Harm
48 points by bootload  2 hours ago   8 comments top 3
1
navait 1 hour ago 2 replies      
There is no war on general purpose computers. If you want one, you can get one.

Some people need one, some people don't. What we are finding is that very few people wanted a general purpose computer in the first place. They wanted to do a thing, and a general purpose computer did that thing (along with many other things they didn't want to do or didn't understand).

You could buy a DVD player that was region-unlocked. You can purchase unlocked phones. You don't have to buy an iPhone or stick to the Google Play store, but it's nice to know that you could live in the garden if you wanted to.

Is the NSA a problem? Absolutely. Could requirements like cell-phone termination be used for nefarious means? Yes. But we built strong democratic institutions for a reason, and we should be turning to them, not to some ideology that helps only the technologically gifted. Mr. Doctorow needs to argue against each policy on its individual merits, not claim it violates some FSF ideology he loves and fearmonger to luddites.

2
AndrewKemendo 1 hour ago 0 replies      
I hate to say it but I think there is no way to prevent this.

If the human experience shows us anything, it is that those with power will use any instrument to exert their will. The internet, and computing generally, was not born from anarchy and did not exist free from powerful influence from the start.

There is no man-made system that is not corrupted by powerful interests.

3
dang 1 hour ago 1 reply      
Url changed from http://boingboing.net/2014/12/26/war-on-general-purpose-comp..., which points to this.

Discussion of related talk from yesterday: https://news.ycombinator.com/item?id=8805039.

29
Show HN: Emacs image editing via imgix
24 points by jacktasia  8 hours ago   4 comments top 2
1
coding4all 5 hours ago 0 replies      
This is great! I can barely remember my editor/IDE days before Emacs. I mean there was nano, pico, vi, vim, Gedit, and on to Eclipse, but Emacs was the only one that allowed me to work the way that I think.
2
teddyh 4 hours ago 2 replies      
This is great and all, but it's yet another Emacs mode for a Software-as-a-Service.

Viewed purely as an image manipulation tool, I have real trouble seeing a good reason why this couldn't be done locally using Cairo. Viewed as an Imgix parameter previewer, I don't see why you wouldn't simply use Imgix directly via a web browser.
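For context, imgix works by encoding image operations as URL query parameters against a hosted source image, so "previewing parameters" mostly means rebuilding URLs. A rough sketch of that idea in Python; the domain, path, and parameter names are purely illustrative (check the imgix docs for the real parameter set):

    from urllib.parse import urlencode

    def imgix_url(base, path, **params):
        # Build an imgix-style rendering URL: source image plus query-string parameters.
        # The domain, path, and parameter names used here are hypothetical examples.
        return "https://{}{}?{}".format(base, path, urlencode(sorted(params.items())))

    print(imgix_url("example.imgix.net", "/photo.jpg", w=400, blur=20, sepia=60))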

30
Show HN: Guide to Elite College Admissions and SAT Prep
14 points by BlackJack  6 hours ago   7 comments top 5
1
peter_l_downs 1 hour ago 1 reply      
First, I think it's awesome that someone put this together to try to help kids get into better colleges. That's great, it most likely took a lot of time, and it seems like a project done entirely out of good will. That said, I had a really strong negative reaction after reading the introduction.

I'm a college freshman, although since I spent some time working after high school it's been about 2 years since I applied to colleges. I was aware of College Confidential, but it seemed like a shitty site full of people pretending that they knew more than me about the college process -- lots of comments like "no way you could get in there with those scores." As far as I can tell, here's all you need to do to improve your SAT: take lots of practice tests. And I'm not even sure that's the best way to improve your chances of getting into a selective school. If you ask colleges how to improve your chances of getting in, they'll tell you to become more interesting and well-rounded. I don't buy the theory that colleges lie about this out of some ulterior motive; I just don't see the incentive. I ended up getting extremely lucky and got into a great school, but the only thing I can think of that would have made me "stand out" in an applications process is being relatively good at programming.

Beyond a certain point of classes / SAT scores, you're going to be much better served doing something productive and interesting with your time than endlessly taking practice tests and posting on a forum full of people who don't know what they're doing. The only people who really know whether or not you're going to get into a college are its admission officers.

2
therobot24 4 hours ago 0 replies      
nice site, but a lot of the information is very vague:

[Essays]> Read a lot of college essay books. See what's cliche and what's not. Absolutely avoid cliche topics.

Such as.....?

[Recommendations]> These can be crucial, but usually aren't.

So they are, but they aren't... Give an example of when they are and when they aren't.

However, you do have some good information that you don't emphasize enough:

[Extracurriculars]> The truth is that most people, high school, college, or otherwise, are not really passionate about anything. The goal of your extracurriculars should be to explore things that you may be interested in. The more unique your ECs are, the better. Don't do math club just because you saw someone online doing it.

[Overview]> Being the smartest in your high school doesn't guarantee you anything.

Also, if you're going to use references, e.g. in Essays: "Some good books are [1,2,3]", make [1,2,3] links to said sources.

3
BlackJack 6 hours ago 0 replies      
Hey all,

I recently finished this website. It aggregates a lot of material from College Confidential and other sources and just tries to present it in a nice manner. The goal is to explain how top college admission works and most of the rest is on SAT preparation.

I used Pelican + Sphinx for all of it. Would love to get your feedback.

4
graeme 3 hours ago 1 reply      
Interesting site. You know the SAT will be redesigned in a year? I tutor the SAT occasionally, and thought about making a site, but the pending change put me off developing anything major.
5
akhilcacharya 4 hours ago 0 replies      
Dang, if only I had this 2-3 years ago...
       cached 1 January 2015 05:02:03 GMT