hacker news with inline top comments    .. more ..    3 Mar 2014 News
1
Police hid use of cell phone tracking device from judge because of NDA arstechnica.com
63 points by rajbala  35 minutes ago   5 comments top 2
1
derekp7 4 minutes ago 1 reply      
So here's the problem I have with most of these types of cases, where police don't follow proper procedure to capture a suspect. Let's say they went through normal procedures -- get a sketch artist to make a possibly inaccurate caricature of the suspect, use that to make an arrest, and have the victim ID the suspect. This has a good chance of convicting the wrong person: if the sketch is inaccurate, the victim may have that image in their mind afterwards, which would influence picking the suspect out of a lineup.

Now, using the alternative procedure, in which the police are tracking down the victim's stolen phone (I'm assuming with the victim's blessing), there is a much higher chance of getting the right suspect.

So, how do you argue to someone that the police should use inferior (but legal) evidence to put the (possibly wrong) person away, instead of more accurate (but Fourth Amendment-violating) evidence to get the right person? In cases where proper procedure always yields superior evidence, the argument is easier to make (such as standards for collecting DNA evidence to avoid contamination). But in other cases, it is possible that unauthorized methods can lead police away from an innocent person and toward the guilty one.

2
csense 9 minutes ago 2 replies      
The police department has a constitutional obligation to follow the Fourth and Fifth Amendments. Don't these obligations trump the provisions of any private contract?
2
Goodbye Academia anothersb.blogspot.co.uk
149 points by andyjohnson0  3 hours ago   90 comments top 24
1
WestCoastJustin 1 hour ago 0 replies      
It's worth noting that this guy is running a kickstarter campaign re: "A free, up-to-date, crowdsourced protocol repository for the life sciences". If we are sending tons of traffic his way, maybe we should send some to his kickstarter too.

https://www.kickstarter.com/projects/1881346585/protocolsio-...

2
naterator 2 hours ago 4 replies      
These kinds of things are daily discussions in laboratories all over the world. Everyone knows there's no future in academia, and so everyone is looking for an exit plan. I actively discourage people from doing a PhD. Academia is a horrible feudalistic system that doesn't pay well enough to keep the bright minds it attracts. There is no sense in being a post-doc for life. Much better to take that hedge fund job.

The really concerning potential consequence is a dearth of innovation in cures for diseases, and we won't see the effect of those losses until 5 or 10 years down the road. No one will fund a biotech startup that's not backed by MDs or PhDs and academia-approved proof-of-concept results. Good luck getting that when everyone is running for the lifeboats.

3
eykanal 2 hours ago 1 reply      
This is almost exactly my story as well.

One point he doesn't mention is that there are many interesting problems in the "real world". Lots of academics just point to industry in general and say, "no freedom, no thanks." I've held a few jobs, and all of them presented me with unique, interesting, and challenging problems. I've had the freedom to choose my own approach to solving problems and have met up with other academically minded people for good lab-meeting-style discussions about how to tackle a project. Industry positions can be pretty attractive.

4
plg 1 hour ago 3 replies      
Yet another worrying sign: my university (and I've heard similar stories from colleagues at others) is now deciding that instead of hiring 2 or 3 junior faculty (new academics straight out of a postdoc into their first faculty job), the priority is to spend the same money on one mid-career "poach" from a competing institution: double the salary and a bigger startup package. Their rationale is that mid-career scientists will bring larger and more research grants (and will do so faster after arriving) than junior scientists. Essentially, let someone else take on the "risk" of new faculty members, and we poach the proven ones.

It's a jungle out there people. If you care about salary and upward mobility for god's sake don't go into academia.

PS: I am a full professor at a large research-oriented university in North America. Most of my contemporaries from high school and undergrad who have spent similar numbers of years amassing expertise in their chosen fields, but in the private sector, are now making approximately 5x to 7x my annual salary (not including their annual bonuses).

5
dnautics 1 hour ago 1 reply      
I quit my second postdoc to launch a nonprofit research institute (we did not get funding, and are retrying later this year). I currently drive for lyft - and make more money than I did as a postdoc, with far fewer hours and better working conditions. This makes economic sense; it's not clear to me that what I was doing as a scientist was really doing society any good, at least as a late night driver I'm 1) giving people what they want and 2) keeping drunk people off the streets (a social good).

My academic path has been tortuous. I graduated from a really good undergrad and went to an even better grad school (my cohort is basically placed as faculty at places like Berkeley, Stanford, UCSD, TSRI, etc.). But in grad school I lost time cleaning up after an irresponsible grad student (who, btw, is faculty at UW) and only published two papers that aren't flashy but are solid, in second-tier journals. Did an amazing first postdoc actually possibly helping the world (pushing forward a drug candidate), at a third-tier school -- due to the economic collapse, it was hard to get a good position in 2009. One publication, second-tier journal. Did a more amazing second "postdoc" (actually hired as a BS biologist, via craigslist) under a Nobel laureate, at an institution where publishing isn't a priority and the resources available are somewhat orthogonal to doing the comprehensive set of experiments necessary to get a Cell/Nature/Science paper. My efforts resulted in improving an enzyme -- three times (there are very few people who can claim to have done that even once) -- again, second-tier journals (two papers are still in progress; even though I've quit, I'm still going over there to get them written up). I'm not really ever going to get a faculty position (tried, two years running). I see crappier postdocs and grad students get their run, but you know what? I don't care anymore.

6
k2enemy 2 hours ago 0 replies      
It isn't just that research funds are drying up. There's also a growing number of Ph.D.s fighting over the shrinking pot of money [0]. I usually advise students not to pursue a Ph.D. because the life of the median academic is pretty awful. But at the same time the general job market for new grads (in most fields) isn't all that great either, meaning that in relative terms the Ph.D. route hasn't fallen too far in the rankings of post-graduate plans.

[0] http://www.nsf.gov/statistics/sed/2012/data_table.cfm

7
Balgair 14 minutes ago 1 reply      
My brother is leaving industry and going into the ivory tower later this year. He says it's not just the DoD or the academy, it's all government funding. He says the PhD is like a union card now. Lockheed just fired 4000 people [0] and NASA's average age is about 53 [1]. The entire DoD workforce is aging and about to retire, with effectively no one in the 25-50 year old range. You'd think they would then start hiring people in those age brackets, but no. It seems the idea is to just let government funding die a slow death, or somehow transition to drones. It's increasingly likely that the good jobs are going to come from private funding, due to decreased tax revenue. We can see this in the Valley right now. This means you gotta know people to get the work, not just be 'good.'

[0] http://www.aviationweek.com/Article.aspx?id=/article-xml/awx...
[1] https://wicn.nssc.nasa.gov/c10/cgi-bin/cognosisapi.dll?b_act...

8
6cxs2hd6 2 hours ago 6 replies      
Not to sound like some crazy person with the word SOCIALIST written on the inside of my forehead in neon colors, but:

It should be possible for WhatsApp to get "only" $18 billion and for scientific research to get the remaining $1 billion. About a 5% tax.

That way, venture capitalists could claim to fund innovation and would actually be truth-tellers.

9
gjuggler 2 hours ago 3 replies      
This is a really thoughtful post highlighting many of the deeply-rooted problems in securing funding as an early-stage academic. It's depressing for bright young scientists to be looking forward to lives as assistant professors submitting grant after grant with an expected ~10% success rate.

But what surprised me most was that at the end of the essay, after having described his fear of facing such uncertainty in NIH funding, the author mentions that he left academia to co-found a startup making software for life scientists.

Wait a minute: don't small software startups have equally poor success rates? (e.g. http://www.quora.com/What-is-the-truth-behind-9-out-of-10-st...)

If uncertainty of success was his major concern, hasn't the author chosen a pretty poor next step in life?

10
zenbowman 2 hours ago 1 reply      
On the one hand, it is sad that good, forward-looking academics are being denied funding.

On the other hand, during my time as a PhD student doing research at a top-tier university, I saw quite a few projects where I was disturbed that we were spending any taxpayer dollars on them at all.

I think the existing academic model is unfair to professors and especially to undergraduate students. Allowing the very top professors to focus on research, while making the rest take teaching seriously, could remedy the situation.

11
ChristianMarks 30 minutes ago 0 replies      
I've left academia and returned, then left and am returning again. Thanks to the sequester, the NIH and NSF funding situation is bleak--I know successful PIs who currently have zilch. They're pursuing consulting contracts to make ends meet--at least in engineering this is possible. At this point it is trolling--pure sadism--to suggest that confidence and hard work will overcome the destructively competitive working conditions many academics look forward to every day. (I suppose I could be confident, for an additional charge.)

I happen to be returning after leaving a non-academic job a few months ago. I spent the intervening months working on an application with a friend. Over the past decade we have been attempting to solve a certain problem for ourselves. After dead end upon dead end, we have a prototype. Now, on the verge of re-joining the academic precariat, a potential customer has asked us for how much we would license our software. We'll see how that goes.

The probability of my landing a tenure-track position anywhere is less than the probability that the software venture succeeds (perhaps this isn't surprising, judging from my posts online here). One tires of playing zero-sum games for diminishing payoffs against people who should be your collaborators. This is the kind of cost-benefit analysis one doesn't do explicitly that seems to underlie decisions to leave. (I am rational, according to a cost-benefit analysis I haven't done.)

12
001sky 1 hour ago 1 reply      
However, one aspect of being a professor has been terrifying me for over five years now: the uncertainty of getting funding from the NIH. No, let me rephrase that. What is terrifying is the near-certainty that any grant I submit would be rejected. I have been waiting for the funding situation to improve, but it seems to only be getting worse. I personally know about ten scientists who have become professors in the last 3-4 years. Not a single one of them has been able to get a grant proposal funded; just rejection, after rejection, after rejection. One of these is a brilliant young professor who has applied for grants thirteen times and has been rejected consistently, despite glowing reviews and high marks for innovation. She is on the brink of losing her lab as her startup funds run out, and the prospect of this has literally led to sleepless nights and the need for sleeping pills. How can this not terrify me?

Why does MIT require funding from the NIH? Isn't this what endowments and tuition are for? Imagine if Google or GE hired people and forced them to raise money from the federal government to build their next project. Notwithstanding the misappropriation of the profits, purely from a managerial perspective this is highly flawed.

The flipside is also true: universities are structured to leverage outside capital rather than their own (despite having gobs of it). MIT has $11B in the bank; they are not desperate for cash to do "science", or otherwise.

13
eranki 1 hour ago 0 replies      
Trying to become a successful academic seems about as insane as trying to become a rock star.

Here's Peter Higgs on how academia today compares to the past: http://www.theguardian.com/science/2013/dec/06/peter-higgs-b...

Most people I know who went down the academic route have left that world or are seriously thinking about leaving. I know a couple of people with positions at top universities; around age 30 their careers are just starting, with tenure potentially a coin flip.

And it always traps the most brilliant people. That's the worst part.

14
singingfish 2 hours ago 0 replies      
It's not just the US; it's international. Last time I was interviewed for an academic position, the interview feedback I got made no sense whatsoever. A different (Ivy League) position doing some really interesting work fell through for bureaucratic reasons around the same time. It was around then that I FYIQ'ed myself.

If you're technically minded, open source software development provides many of the good things about academic work without many of the downsides.

15
Create 2 hours ago 0 replies      
"How should we make it attractive for them [young people] to spend 5,6,7 years in our field, be satisfied, learn about excitement, but finally be qualified to find other possibilities?" -- H. Schopper

Indeed, even while giving complete satisfaction, they have no forward vision about the possibility of pursuing a career at CERN.

This lack of an element of social responsibility in the contract policy is unacceptable. Rather than serve as a cushion of laziness for supervisors, who often have only a limited and utilitarian view when defining the opening of an IC post, the contract policy must ensure the inclusion of an element of social justice, which is cruelly absent today.

http://staff-association.web.cern.ch/content/unsatisfactory-...

In my three years of operation, I have unfortunately witnessed cases where CERN duties and educational training became contradictory and even conflicting. This has particularly been the case when the requirements of the CERN supervisor conflict with the time expected to be dedicated to a doctoral student's thesis.

http://cds.cern.ch/journal/CERNBulletin/2013/27/News%20Artic...

16
slamdesu 1 hour ago 0 replies      
Many of the recent 'goodbye academia' posts that I've come across seem to be from people in molecular biology, which has a reputation for being a particularly competitive field for funding. I wonder whether academics in other fields are dropping out at a similar rate?
17
curveship 1 hour ago 1 reply      
My own theory is that it will take academia another generation to figure out what it has done to itself. With entry-level conditions becoming more and more terrible, the best students will increasingly seek careers outside the academy, and academia will become populated by the second tier. At that point, it will have lost prestige that will take another century to repair.
18
it_learnses 2 hours ago 1 reply      
It is indeed sad what's going on in the U.S. with regard to research funding. Maybe you can go directly to the public, who sympathise -- for example, a Kickstarter of sorts for researchers?
19
DanAndersen 1 hour ago 0 replies      
I'm curious about what areas of academia are more or less affected by these trends. I'm planning on starting a PhD program this year in computer science (emphasis on graphics research), and I had been under the impression that the state of funding was not so dire in CS as it is in the other sciences (the sciences that are more science than engineering), but I'd certainly be interested in any perspectives about academia in my field in particular.
20
newyorklenny 2 hours ago 0 replies      
Indeed, I left academia for a startup. Securing funding for ZappyLab is by no means easy (http://anothersb.blogspot.com/2014/03/hello-startup-sequel-t...). But as hard as it is, there are many VCs and angels, and there is crowdfunding (we are running a Kickstarter campaign now). Certainly not easy, but there are more options. You run a genetics lab and lose your NIH grant -- where do you go?
21
fonnesbeck 43 minutes ago 1 reply      
Apparently, it's not possible to leave academia unless you write a detailed blog post about it.
22
academocrat 1 hour ago 1 reply      
Can someone post some numbers please?

How many people are getting doctorates (probably increasing)?

How many faculty positions are there over time (probably increasing much less)?

How much money per researcher is out there (probably shrinking RAPIDLY)?

How can we measure whether we are getting more lenient with giving out PhDs? Shouldn't only the cream of the crop get to do research, not just anyone who claims to love it?

23
smartiq 1 hour ago 0 replies      
Not sure why such a mundane story got such a huge outpouring.
24
blt 1 hour ago 0 replies      
Some people can't produce any output when sleep deprived.
3
All RGB colors in one image joco.name
187 points by seanalltogether  5 hours ago   37 comments top 13
1
GuiA 4 hours ago 0 replies      
The SE thread linked in the post is really worth the read: http://codegolf.stackexchange.com/questions/22144/images-wit...
2
devindotcom 4 hours ago 3 replies      
There's actually a community of all-RGB image creators: http://allrgb.com/
3
nwh 4 hours ago 1 reply      
Site is 508'ing. Here's a partial mirror.

http://archive.is/nZwxl

Embedded youtube video:

https://www.youtube.com/watch?v=OuvFsB4SLhA

4
blhack 3 hours ago 1 reply      
Server seems unavailable :(

There was a challenge a friend of mine and I had a while back: take an image, and represent the entire RGB colorspace within that image, without reusing any colors.

It was a REALLY fun challenge, and I encourage everybody to try it as well.

The way that I ended up winning (ha) was to represent the RGB colorspace as a 3D array, and then unravel that 3D array into a 1D skip-list.

The script reads the pixel value at (n,n) of the image, "decides" where that color would live if the RGB cube were unraveled into 1D, goes to that spot, and then either sets that color in the "new" image (if the color was unused) or follows the "skip" destination: the place where the closest unused color is located.

I believe this was the site that inspired us: http://allrgb.com
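For the curious, here is a small Python sketch of the skip-list idea described above, shrunk to a 4-levels-per-channel cube (64 colors) so it runs instantly. The function names and sample input are my own illustration, not the commenter's actual code:

```python
def color_to_index(r, g, b, levels=4):
    # "Unravel" the 3D RGB cube into a 1D index.
    return (r * levels + g) * levels + b

def index_to_color(i, levels=4):
    r, rem = divmod(i, levels * levels)
    g, b = divmod(rem, levels)
    return (r, g, b)

def all_rgb(source_pixels, levels=4):
    """Recolor source_pixels so every color in the cube is used exactly once."""
    n = levels ** 3
    used = [False] * n
    skip = list(range(n))  # skip[i]: where to look next if color i is taken

    def claim(i):
        # Chase skip pointers to the nearest unused color, then compress
        # the path so later lookups stay fast (union-find style).
        root = i
        while used[root]:
            root = skip[root]
        while skip[i] != root:
            nxt = skip[i]
            skip[i] = root
            i = nxt
        used[root] = True
        skip[root] = (root + 1) % n  # next probe starts just past this color
        return root

    return [index_to_color(claim(color_to_index(*p, levels=levels)), levels)
            for p in source_pixels]

# Even a flat single-color "image" comes out using all 64 colors exactly once.
out = all_rgb([(1, 1, 1)] * 64)
assert sorted(out) == [index_to_color(i) for i in range(64)]
```

Scaling this to the full 2^24-color cube only changes `levels` to 256; the skip pointers are what keep the "find the nearest unused color" step from degrading into a linear scan.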

5
headShrinker 2 hours ago 0 replies      
Some of these images, in their most basic form, are strikingly beautiful. It reminds me of some of the algorithmic art Joshua Davis produces (http://www.joshuadavis.com).

The OP might consider a more art-based context, with potential for high-priced prints and gallery showings.

6
80 3 hours ago 0 replies      
While the site's down, here's all the RGB colors in one book:

http://i.imgur.com/kae66HY.jpg

('RGB Colorspace Atlas' by Tauba Auerbach)

7
leberwurstsaft 25 minutes ago 0 replies      
There's this iOS app that generates very similar images, but with some interactivity: https://itunes.apple.com/app/rgb-petri/id423126001?mt=8
8
hk__2 2 hours ago 2 replies      
I'm wondering how much of the visible color space is represented here. 100%? 80%? 50%?
9
jonomw 2 hours ago 0 replies      
Server seems to be down but found the Google cached version: http://webcache.googleusercontent.com/search?q=cache:ghZGR0R...
10
lelandbatey 3 hours ago 1 reply      
Man, that video was really cool! I wish the author had released the source code for this though, I'd love to know how this was made.
11
ars 4 hours ago 1 reply      
Such an image is smaller than you might think: uncompressed, it's 50.33 MB.

It's time to move to 48 bit images.
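The arithmetic behind that figure, sketched in Python: an all-RGB image has one pixel per 24-bit color, at 3 bytes per pixel.

```python
# Size of an uncompressed image containing every 24-bit RGB color once.
pixels = 2 ** 24                    # 16,777,216 colors, e.g. a 4096x4096 image
raw_bytes = pixels * 3              # one byte each for R, G, B
print(raw_bytes)                    # 50331648
print(round(raw_bytes / 10**6, 2))  # 50.33 (decimal megabytes)
```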

12
recursify 4 hours ago 0 replies      
I'm getting a 508 error...

But somewhat related, and I think was on HN at some point: http://corte.si/%2Fposts/code/hilbert/portrait/index.html

13
gabipurcaru 4 hours ago 1 reply      
It's very interesting how differently our eyes and brains process colors -- for example, I see a lot of green and blue, but less red (though in my case this is probably due to a mild case of protanomaly).
4
FSF joins forces to fight software patents in U.S. Supreme Court fsf.org
84 points by conductor  3 hours ago   10 comments top 3
1
jfasi 1 hour ago 1 reply      
In anticipation of a slew of "this is so obvious" comments that quote from the brief, I urge you to remember that there's an opposing side to this argument: no matter how compelling or obvious this brief's arguments may seem on their own, the court will be weighing them against the arguments of the opposing side.

Without seeing those arguments, we can't really make any substantive judgements about this brief.

2
kodablah 39 minutes ago 1 reply      
I have read the brief. It basically says to consistently apply the "machine or transformation" test to a patent to determine whether it is standalone software or something substantive. It also mentions that some software doesn't rely on a specific type of machine to execute.

How is the PTO expected to tell a "standalone software" patent apart from a "software as part of a machine/transformation" patent? And if it can't, and the law is only applied in retrospect (e.g. patent validity challenges), how does this abstract "machine or transformation" concept prevent litigation that seems primarily driven by intimidation, not expectation of victory?

3
rqebmm 2 hours ago 2 replies      
Can someone explain to me exactly what it would mean if software ideas were no longer patentable?
5
Beets - cli music manager and auto tagger github.com
16 points by dewey  29 minutes ago   2 comments top 2
1
bjackman 0 minutes ago 0 replies      
Cool - I'll set this up tomorrow, it looks good and my Music library is a mess.

It would be nice to have a centralised repository on the website where people can share plugins (like Sublime's Package Control, which is 3rd party). Programmable is great, but it would be a shame if everyone had their own half-arsed solutions to common problems.

2
fournm 16 minutes ago 0 replies      
Ooh, this looks exactly like what I was looking for like ... a year and a half ago to go along with dmpc.
6
Meetup site down after hacker sought to extort $300 from CEO cso.com.au
17 points by drewjaja  36 minutes ago   discuss
7
Help me distribute $100,000 to new entrepreneurs in Africa paulbuchheit.blogspot.com
148 points by paul  5 hours ago   65 comments top 17
1
rdl 3 hours ago 1 reply      
I love the idea, but I have one concern, which may or may not be real.

I think 5% is below the market rate for what you are doing. By establishing 5% as the rate, you crowd out any local investment options that would need to charge above 5% (but not 60%). By making the subsidy non-explicit, you make it even harder for local lenders to compete.

I don't know what a market rate for this kind of loan would be, but in the US it would be more like 10%. Some kind of explicit discount for social benefit, or marketing to launch a new business, could make sense, but I really don't want to see the same situation where USAID dumps free or subsidized goods and services and crowds out the local producer.

At the same time, sacrificing the (not yet extant) p2p lending sector in Kenya might be OK if it helps enough other businesses -- similarly, subsidized communication and security might be a net good.

2
nanijoe 4 hours ago 3 replies      
I find the effort here commendable, but it will probably not take very long for these guys to realize that "Africa is not a country". I'm Nigerian, and I have been to a few African countries. First of all, the challenges you have to overcome while doing business in Nigeria alone vary wildly from one city to another. Comparing that to East Africa, for example, the difference is night and day. It may serve these guys well to begin their venture in the one African country they find most familiar/friendly, then expand their efforts from there.
3
jkurnia 5 hours ago 6 replies      
Dear all - I'm Julia, the founder of Zidisha. I'd welcome comments and questions from anyone who would like to learn more about our P2P microlending platform.

If you'd like to hear more about my own journey and the experiences that lead to the founding of Zidisha, see http://www.huffingtonpost.com/julia-kurnia/why-i-founded-zid...

4
OoTheNigerian 4 hours ago 1 reply      
Hi Paul,

First of all, I would like to commend your worthy contribution to the good work the people at Zidisha are doing.

There is something I want to draw to your attention (I searched and did not find your email).

I am wondering how to balance the stereotyping of the "African Entrepreneur" while at the same time not forgetting the millions of small-scale business owners in Africa.

When I first saw the headline, I was excited that finally a top tech investor would be investing in African startups -- only to look and see yet another (VERY COMMENDABLE) but charity-like donation.

In general, you do not see VCs from the West investing in African startups. That door is almost always closed. When they do, it has to be under the category of impact investment and/or a "social enterprise".

Why do you think this is so?

Two years before the WhatsApp acquisition, a Ghanaian startup, SAYA (https://www.saya.im/), came to Disrupt, but they could not get any funding -- even with hundreds of thousands of waiting signups. Almost as if they were not worthy to tackle high tech.

I am very certain that if these guys had built an SMS app to remind farmers of prices in the market (or something along that line) they would have got funding.

Thankfully, things are changing a bit, as Dropifi (http://www.dropifi.com/, from Ghana) got funding from 500 Startups.

PS: I want to make it clear that what you are doing is very worthy and commendable and this is not meant to be a criticism of you or this action in ANY way.

PPS: I will be in SF from Wed-Sunday this week and would be very happy to chat with people who want to learn about the tech scene in Nigeria (yes and that includes 'Nigerian scams' et al :)).

Oh. I also do not mind surfing on your couch if available too :). You will be paid in Nigerian beer (I brought some).

my email is in my profile.

5
wehadfun 28 minutes ago 0 replies      
Why not just give the entrepreneurs the $200?

The money they would be paying back they could use to build up their business.

I doubt anyone would do this investment for a return.

6
lukasm 2 hours ago 1 reply      
Loans are a much better form of helping, but investing with a partnership and advisors is a better option still. Money is the key ingredient, but some ventures do require a lot of knowledge. Are loans preferred due to regulatory issues? How do you overcome the "this is just money from rich Western guys to dump their guilt" mentality? Are you going to build an alumni network, in a similar vein to YC?
7
dchs 5 hours ago 0 replies      
Neat idea. I love that the repayments are going back into funding Zidisha itself -- a unique way to raise ~$100k! Congrats on the funding round, Julia :)
8
newfund 3 hours ago 0 replies      
Note: Kiva has a program called Kiva Zip where you can make direct loans to entrepreneurs in Kenya at 0% interest: https://zip.kiva.org/
9
netcan 4 hours ago 0 replies      
I like the idea of p2p financial services. However, everything development-related is complicated, and the stories describing projects are often romanticised.

There was a great podcast recorded during the initial microfinance boom. One surprising conjecture is that most small loans are for consumption, not seed capital, and that this is rational.

http://www.econtalk.org/archives/2011/04/munger_on_micro.htm...

10
bliti 3 hours ago 0 replies      
This is a fantastic thing to do. I do have one question: do you plan to do the same thing (maybe on a smaller scale) in the US? Say, help people in some impoverished area of the country with some small funding to get micro businesses going. Not arguing against the choice to do so in Africa (it's great), just interested in whether there would be a push to do it "locally".
11
phantom_oracle 3 hours ago 1 reply      
Interest is the worst thing you can use in a continent like Africa.

Why didn't you guys opt for the profit-sharing philosophy?

Contract laws in some countries can turn some entrepreneurs insolvent because of interest, but profit-sharing won't have such a devastating impact.

I'd help, but not under compound interest restrictions.

12
michaelmior 4 hours ago 1 reply      
Just a note that the default tweet is actually too long and needs to be manually edited.
13
brandonb 4 hours ago 0 replies      
How incredibly cool!
14
bayesianhorse 4 hours ago 0 replies      
You normally see requests for help distributing money the other way around...
15
coltr 2 hours ago 0 replies      
Wow this is awesome. Love the charity:water esque updates idea.
16
ghouse 4 hours ago 1 reply      
"Distribute" or "Invest"?
17
nsxwolf 5 hours ago 0 replies      
Sure! I'll make sure it gets to them. Where can I pick up the money?
8
Arq 4 is out Mac backup to S3/Glacier haystacksoftware.com
47 points by michaelx  2 hours ago   28 comments top 12
1
chimeracoder 1 hour ago 1 reply      
Pretty awesome to see this. When I used a Mac for work, I used Arq and set it up on my coworkers' computers (they were completely non-technical). It was very easy to use.

I'm curious what backup tools people use on Linux if they want to back up files to Glacier. I use git-annex [0] for certain files (it works well for pictures and media). The rest of my backup process is a fairly rudimentary (though effective) rsync script, but it doesn't use Glacier.

My current setup works fine for me, but I imagine there are better tools out there.

[0] https://git-annex.branchable.com/

2
willtheperson 7 minutes ago 0 replies      
Can someone tell me why I shouldn't be using BitTorrent Sync as a multi-location backup plan?

In other words, can someone sell me on the idea of paying for AWS storage when I have dirt-cheap storage around my house, and even a remote location I can stuff a huge drive in?

3
tunesmith 54 minutes ago 2 replies      
I've regularly been curious about this, but Crashplan has a stable reputation and seems much less expensive for large backups. For those who have researched Crashplan, why did you choose Arq instead?
4
dewey 1 hour ago 1 reply      
5
rglover 51 minutes ago 0 replies      
Quick endorsement: buy this. Arq has allowed me to be insanely careless about backups and has never failed. It makes the whole backup thing cost-effective, too, with Amazon being dirt cheap these days.
6
DomBlack 8 minutes ago 0 replies      
How does this compare to say using Tarsnap (apart from cost)?
7
leejoramo 2 hours ago 1 reply      
Arq has been part of my backup system since version 2. This upgrade looks very good, especially the expansion beyond relying solely on Amazon S3 to a number of other services, including SSH.

Having used the past Glacier support, I expect the new S3 Glacier Lifecycle support will be much better.

I am wondering about when (if?) the open source arq_restore and format documentation will be updated.

8
Goopplesoft 1 hour ago 2 replies      
I was looking at Arq the other day and couldn't find out if it has a bootable backup feature (like CC Cloner). Anyone know anything about this?
9
AdamGibbins 1 hour ago 0 replies      
Wonderful piece of software - highly recommended. Awesome upgrade also, have been waiting for S3 alternatives for a long time.

It's unfortunate a few things appear to be backwards: why can you include Wi-Fi APs, yet not exclude them, despite the example suggesting you exclude tethered devices?

Likewise, why can you email on success, but not on failure?

10
michaelx 1 hour ago 0 replies      
I've been using Arq since 2011 to back up my most important data to Amazon S3+Glacier and can highly recommend it.

Today v4 has been released, and it comes with new storage options (GreenQloud, DreamObjects, Google Cloud Storage, SFTP aka your own server), multiple backup targets, a unified budget across S3 and S3/Glacier, email notifications, and many more clever features.

11
xxdesmus 1 hour ago 0 replies      
Big fan, the inclusion of DreamHost's DreamObjects is a huge improvement also. In most cases they are cheaper than Amazon's S3 storage. Still more expensive than Glacier...but DreamObjects doesn't have any of the crazy slow retrieval times or costs that Glacier does.
12
coffeecheque 1 hour ago 2 replies      
I love the addition of SFTP, and I hope to buy the update soon.

Can anyone recommend a SFTP backup provider?

My Arq backups are designed to be worst-case. I have other, local backup options in case of failure. I was using Glacier, but I ran into Arq3 sync problems and I need to re-upload all my data. Glacier is very slow from where I live. I assume SFTP will be a bit faster.

9
Cracking Linksys Encryption devttys0.com
16 points by amboar  55 minutes ago   4 comments top
1
georgemcbay 18 minutes ago 2 replies      
The more things change, the more they remain the same:

Back in 1995 I "cracked" Cisco's router password "encryption":

https://groups.google.com/forum/#!original/comp.dcom.sys.cis...

Strikingly similar 'security', which is extra funny as Linksys is owned by Cisco. (EDIT: Not anymore! Thanks for the correction)

Back then net admins would regularly post their configuration files (with 'encrypted' passwords left intact in most cases) to usenet to get help/tips on how to better configure their routers, which was an unaddressed (by Cisco) security nightmare.
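For the curious, the scheme being mocked here — Cisco's reversible "type 7" obfuscation — is just a XOR against a fixed, publicly known key string, with the first two digits of the stored value giving the starting offset into that key. A minimal Ruby sketch of a decoder (the XLAT table is the scheme's well-known constant; the sample input is a commonly cited example, not anything secret):

```ruby
# Decode a Cisco "type 7" obfuscated password. The scheme XORs each byte
# against a fixed, publicly known key string; the first two digits of the
# stored value are the starting index into that key.
XLAT = "dsfd;kfoA,.iyewrkldJKDHSUBsgvca69834ncxv9873254k;fg87"

def decrypt_type7(encoded)
  offset = encoded[0, 2].to_i                          # starting index into XLAT
  bytes  = encoded[2..-1].scan(/../).map { |h| h.to_i(16) }
  bytes.each_with_index.map { |b, i|
    (b ^ XLAT[(offset + i) % XLAT.length].ord).chr
  }.join
end

puts decrypt_type7("0822455D0A16")  # => "cisco"
```

Which is to say: not encryption at all, just obfuscation — the same class of "security" as the Linksys scheme in the article.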

10
Rails The Missing Parts joingrouper.com
109 points by tomblomfield  4 hours ago   81 comments top 16
1
dhh 3 hours ago 2 replies      
The proof is always in the pudding. While there are good and reasonable times to introduce "interactors", this particular example is poor. The tests presented are anemic, and the code is absolutely not any clearer by being extracted and wrapped. I would indeed have kept all this in the controller.

The key point for an "interactor" extraction is imo when you have multiple models being created in symphony, like a Signup model. Or if you for some reason need to reuse the behavior.

But if all your controllers look like this, with one "interactor" model per action, you're doing it wrong.

Whatever floats your boat, though. If this is what you prefer, great. But please hold the "beginner's version" crap. Plenty of large apps are built with vanilla Rails. Basecamp is one.
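To make the "Signup model" example above concrete, here is a rough sketch of that kind of plain-Ruby extraction — one object coordinating the creation of multiple records. All class names are hypothetical; the collaborating models are injected, which also keeps the object testable outside Rails:

```ruby
# A plain-Ruby "Signup" object in the spirit of the comment above: one object
# that creates multiple related records together. User and Account stand in
# for ActiveRecord models and are injected as collaborators.
class Signup
  attr_reader :user, :account

  def initialize(params, user_model:, account_model:)
    @params        = params
    @user_model    = user_model
    @account_model = account_model
  end

  # Create the user and its account together; returns true on success.
  def save
    @user    = @user_model.create!(@params)
    @account = @account_model.create!(owner: @user)
    true
  end
end

# Usage (hypothetical): Signup.new(params, user_model: User, account_model: Account).save
```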

2
programminggeek 1 hour ago 0 replies      
I've spent more time thinking about clean architecture and design than most, and the conclusion I've come to is that while Ruby gives you the freedom to write clean code and great architecture, it doesn't offer the best tools for the job.

Ruby and Rails feel best as a prototyping platform for getting something out the door fast, proving a concept, and not worrying so much about correctness or maintenance.

I don't think that, if you are doing a lot of TDD and building larger systems, piling on more Ruby and Rails is the right answer. I think once you know what you are working with, a well-designed language with a compiler is a huge help and would remove a ton of useless tests and stuff that you end up writing in Ruby by hand.

This very likely leads to an API driven system with an API written in a strongly typed, compiled language like C#, Java, Scala, Haskell, or Go and writing your front end in whatever makes the most sense for your team.

At that point you get the benefits of a nice rapid development platform like Rails for your front end, and a fast, testable, API written in something else using all the clean code architecture you want.

The trick is, you do everything ugly in Rails or PHP or whatever in your initial prototype and you might not even write tests. You just ship working code until your prototype has proven a business case. Then, you move it towards something with lower TCO over time. Make the investment in your code when it makes sense to invest in it and use the best tool for the job at each step.

You probably never need to leave the standard Rails MVC stuff on most projects unless they are wildly successful and/or your long-term needs change. Even then, you can probably keep the Rails front-end stuff and just talk to an API and be very happy.

3
ollysb 4 hours ago 5 replies      
Perhaps it's because it's written with examples in Java, but I often feel like no one in the Rails community has ever read Eric Evans' Domain-Driven Design[1]. It's far and away the best material I've ever seen on how to organise large code bases. It covers pretty much every suggestion that I've seen from the Rails community. Sometimes the Rails community can feel like the fitness industry, everybody just rebranding things that have been done before.

[1] http://www.amazon.com/Domain-Driven-Design-Tackling-Complexi...

4
pothibo 4 hours ago 0 replies      
One thing I really enjoy with this post is the conclusion. It doesn't try to tell you it's the only way, just that it's the way they found to fix their problem with the constraints they had, as it should be.

Rails is easy to extend, people often forget that. Great post.

5
rjspotter 1 hour ago 1 reply      
I agree completely that the Interactor pattern makes for cleaner Models and Controllers in a Rails codebase. I've used both the Interactor gem and PORO (in a directory outside of /app or /lib).

Having worked with the Interactor gem for a little while once you break things down into small interactors that can be used by the Organizers, I have two main complaints.

1) Inputs are unclear. When calling new or some custom class method, you can use descriptive variable names to say what the object expects to be present to do its job. With the Interactor gem you end up adding comments describing what keys need to be present in the context and what keys will be added to the context, so the next programmer can use the interactor you've created without having to go through and grok everything.

2) You end up having to (re)create a failure protocol to communicate with the controller and display to the user. We take the AR errors functionality for granted in our controllers/views with interactors you have to come up with a similar system.

2.5) as a result you end up writing a lot of boilerplate fail! and rollback code

2.5.2) and a non-atomic operation like notifying the payment gateway can break the whole model of rolling back, so you have to end up raising to make sure your user doesn't end up in an invalid state or get charged twice.
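Point 1 above can be sketched concretely: a PORO whose inputs are keyword arguments documents itself, where a context-based interactor would need a comment block listing its expected keys. All names here are hypothetical — a sketch of the alternative, not the Interactor gem's API:

```ruby
# A PORO service: keyword arguments declare exactly which inputs must be
# present, and an explicit result object replaces an ad-hoc failure protocol.
# ChargeCustomer, the attributes, and the gateway interface are hypothetical.
class ChargeCustomer
  Result = Struct.new(:ok, :error, keyword_init: true)

  def initialize(customer:, amount_cents:, gateway:)
    @customer     = customer
    @amount_cents = amount_cents
    @gateway      = gateway
  end

  # Returns a Result instead of raising, so the controller can branch on it.
  def call
    return Result.new(ok: false, error: "amount must be positive") if @amount_cents <= 0
    @gateway.charge(@customer, @amount_cents)
    Result.new(ok: true)
  end
end
```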

6
tzaman 4 hours ago 2 replies      
We tried using DHH's concerns in place of interactors, but ditched them for PORO/service objects because those can be tested outside Rails.
7
kapilkale 3 hours ago 1 reply      
Can anyone recommend a github repo of a Rails app implementing interactors or any of the similar concepts listed in the comments?
8
wdewind 4 hours ago 1 reply      
This is great, and does not only apply to Rails. In many other frameworks it's common to lump business logic in controllers instead of models. Either way you end up with the same thing: eventually you'll need to add a 4th (at least) type on to MVC.

> Here at Grouper, we're long-time users of Ruby on Rails along with other New York startups like RapGenius, Etsy and Kickstarter.

Etsy doesn't use Rails though, it uses PHP.

9
Axsuul 4 hours ago 3 replies      
I've been looking into incorporating this pattern into my larger Rails apps as well. Another benefit of interactors is DRYing up your code for use in APIs. Some of the more popular interactor gems:

https://github.com/orgsync/active_interaction
https://github.com/cypriss/mutations

10
nwienert 3 hours ago 2 replies      
I've found a happy balance developing with Rails by adding two strategies:

1) Use presenters for display-only logic to keep controllers concerned with managing requests only.

2) Using service objects / custom classes in /lib for actions on models, as well as abstracting any common related functionality: Creators, Updaters, Processors, Orchestrators, etc. Keep your models concerned only with data, not data transformation, and keep your controllers from doing it as well.
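Point 1 can be sketched in a few lines — a presenter wraps a model so display-only logic lives in neither the controller nor the model. The class and attribute names are hypothetical:

```ruby
# A minimal presenter: wraps a model object and holds display-only logic.
# UserPresenter and the attributes it reads are hypothetical names.
class UserPresenter
  def initialize(user)
    @user = user
  end

  # Display-only fallback the model itself shouldn't care about.
  def display_name
    name = @user.name.to_s.strip
    name.empty? ? "Anonymous" : name
  end

  # Formatting belongs here, not in the controller or the view.
  def joined_on
    @user.created_at.strftime("%Y-%m-%d")
  end
end
```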

11
casey_lang 4 hours ago 0 replies      
Along the same lines, but heavier-weight than 'Interactor', is 'ActiveInteraction'[1].

It responds in much the same way ActiveRecord does, allowing validations, errors, forms built from interaction objects, etc.

[1] http://github.com/orgsync/active_interaction

12
edwinvlieg 4 hours ago 0 replies      
We've also come across the limitations of Rails as a one-size-fits-all solution for larger codebases. I don't think this is a fault of Rails, but more of the developers using the framework. Software engineering in general has more focus on a wide variety of design patterns and doesn't limit itself to one framework. In my opinion, Rails is the best framework to create wonderful web applications. Modeling business logic requires different kind of architectures than the simple MVC-ish structure Rails provides.

At MoneyBird we are using the Mutations gem to represent something like the interactors mentioned in this article. One major advantage of Mutations is that it also does input filtering.

https://github.com/cypriss/mutations

13
seancoleman 3 hours ago 3 replies      
A good litmus test of an experienced Rails developer is how large their /lib directories are in relation to project sizes.
14
troels 3 hours ago 5 replies      
One thing I can't figure out is whether to place my service objects under `/app/models` or under `/lib`. There doesn't seem to be a clear consensus on this? I tend more towards the former, because autoload works better during development. Also, I consider them part of my domain model. What do you do?
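For what it's worth, one common answer to the autoload half of this question is to add /lib to the application's autoload paths, sketched here against Rails 3/4 conventions (MyApp is a placeholder for your application's module):

```ruby
# config/application.rb -- adding lib/ to the autoload paths makes classes
# there reload in development just like app/models does. MyApp is a
# placeholder for your application's module name.
module MyApp
  class Application < Rails::Application
    config.autoload_paths += %W(#{config.root}/lib)
  end
end
```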
15
avitzurel 3 hours ago 1 reply      
As always, the question is whether this should be in Rails or not.

I think the answer is clear: this should not be part of Rails, since it doesn't hold true for all apps.

I think that if you want to have something real quick, you don't need an interactor/service class.

When you have a bigger app, you definitely need that, or you will get to a point where your code is split into models/observers/callbacks/lib/app/concerns and you can't find anything.

Rails generators are pretty easy to extend, this way, when you generate a new model, you can easily create the service class for it.

You can also EASILY create new generators that will generate the service classes for you with your defaults and templates and what not.

Not sure Rails is missing that, however, it is definitely a best practice that people with bigger apps should use TODAY.

16
clutchski 3 hours ago 1 reply      
I thought Etsy uses PHP. Could be wrong.
11
Downloading Software Safely Is Nearly Impossible noncombatant.org
186 points by danielsiders  7 hours ago   138 comments top 15
1
kefka 5 hours ago 5 replies      
The problem is much worse than this contrived "I can't download PuTTY securely". Let's choose an example, one that I've had my hands in at my tech support job.

Goal: "Download Firefox"

First, the user was using IE, and is not a tech-savvy user (as in, cannot read words on the screen). Turns out the user's computer was infested with spyware and garbageware, mainly Conduit and others.

Evidently, the user "searched" for Firefox rather than following my directions to type https://www.mozilla.org in the address bar. This behavior led him here: http://firefox.en.softonic.com/

Normally, I would use a remote support tool and just do the cleaning for the user. However, this client comes from another area in which we are not allowed to use the remote support tool.

In the end, I tried to have the user uninstall the bad Firefox and install the good one, but the softonic installer installs a ton of crap everywhere. The user got very frustrated and hung up while I was having him read through the list of installed programs to uninstall.

That is the danger to most users, running Windows.

EDIT: For the user who penalized my comment score: why?

2
bcoates 4 minutes ago 0 replies      
It looks like Windows 8.1 is whitelisting PuTTY by hash or signature: nothing to see here.

Repro steps (Windows 8.1, desktop IE 11 or Chrome 33):

1. Download putty.exe from any shady source

2. PuTTY runs without prompting

3. go to mega.co.nz (an extremely shady source), upload your copy of putty.exe

4. download it again

5. this version of putty.exe also runs without prompting

6. open your hex editor of choice, change a byte in a text string

7. upload this tampered version of putty.exe to mega.co.nz

8. download and run it

9. observe full-screen modal red banner: "Windows Protected Your Computer" requesting an Administrator password to run suspicious binaries.

3
bad_user 5 hours ago 2 replies      
Before reading the article, I wanted to write a rant on why TFA is wrong, based solely on the title :-) Alas, I was wrong, especially because I have downloaded PuTTY myself from putty.org whenever I happened to play with Windows machines, without once thinking that putty.org is not the official source. And I'm a very security-conscious user; if I can't protect myself, then normal users don't stand a chance.

Just a note - PGP signing renders HTTPS useless for downloading the binaries themselves and works by establishing a chain of trust, the problem is with distributing the public key. It's the public key that must be distributed either over HTTPS and/or through a public key server, letting other users digitally sign your certificate and thus endorse the association of this public key - a system that works great for popular repositories of software (e.g. Debian), in which the participating developers/maintainers know each other. Once the authenticity of the public key is correctly established, there's no way for an attacker to create/forge the signed binary, unless said attacker gets ahold of the private key, which is way more difficult than hacking a web server, as normally private keys don't end up on those servers (so it is more secure than HTTPS). For example, in Ubuntu if you're willing to install packages from PPAs of third-parties, you first need to indicate that you trust the public key with which those packages were signed, otherwise apt-get will refuse to install said packages.

A reasonable alternative to PGP signing is S/MIME signing, which is more user-friendly, as it doesn't involve the users vetting scheme, but rather certificates are issued by a certificate authority, just like with HTTPS/SSL. S/MIME is weaker against the NSA, but it does work well for signing stuff and it's more user friendly, because to establish trust, you only have to trust the certificate authority (and of course the developer).

Binaries on OS X are also distributed signed with the developer's key, and OS X refuses to install unsigned binaries or binaries signed by unknown developers unless you force it to. And while I have mixed feelings about the App Store direction in which Apple is taking OS X, I've begun to like this restriction, in spite of the money you have to pay yearly to register as a developer (as long as you can download signed binaries straight from the Internet and are thus not completely locked into Apple's walled garden, it's all good). Signing binaries and having a user-friendly way to establish trust in the signing key used should be the norm in all operating systems.
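On a related note: even without PGP, comparing the digest of what you downloaded against one published over HTTPS narrows the attack surface to the page that publishes the digest. A minimal Ruby sketch using only the standard library (function name is hypothetical; this is only as trustworthy as the channel the expected digest came from):

```ruby
require 'digest'

# Compare a downloaded file's SHA-256 against a digest published on the
# project's HTTPS page. This narrows the attack to the page publishing the
# digest -- it does not eliminate it.
def verify_download(path, expected_sha256)
  Digest::SHA256.file(path).hexdigest == expected_sha256.strip.downcase
end
```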

4
throwaway812 5 hours ago 3 replies      
If you think you're safe: it's the same thing with Linux. Yes, good distros sign their blobs and you can probably verify that with builtin tools.

However, consider how distros generate their signed binaries:

1) A packager downloads a random tarball off the internet, often over HTTP and/or unsigned and unverified.

2) The packager uploads the same tarball to the distro build system (you trust them, right?)

3) The packager's script for building the program or library is executed by the build server (you trust all of the packagers, right? they have implicit root access to your machine during pkg install.)

4) The packager's script likely invokes `./configure` or similar. Now even if you trust the packager, the downloaded source has arbitrary code execution. You verified it, right???

(Not trying to advocate for webcrypto. And I'm a Linux user. But I'm also a packager, and I have some awareness as to how one would go about pwning all users of my distro.)

5
josteink 33 minutes ago 0 replies      
So basically he does a web search for "Windows ssh client" (generic, SEO-spammed terms) when he knows he wants PuTTY (specific), and is surprised that the official PuTTY page is not the #1 hit.

I'd hardly call that a bulletproof argument.

6
dfc 4 hours ago 2 replies      
The moral is obvious. You can't trust code that you did not totally create yourself. -- Ken Thompson[^1]

[^1]: Reflections on Trusting Trust. ACM Turing Award Lecture, 1984, https://dl.acm.org/citation.cfm?id=358210

7
bphogan 4 hours ago 1 reply      
One solution I advocate for is more widespread adoption of Chocolatey (http://chocolatey.org).

I can

cinst putty

and get what I need automatically.

Sure, I have to trust the maintainer, but you know, if more people used Chocolatey to install packages, more people might be able to ensure it's safe.

It's not bulletproof but it sure is better than searching the web for the right download.

8
lmm 5 hours ago 6 replies      
Downloading software safely is nearly impossible on Windows. Probably because there's no demand for it - people who care about security don't use Windows. PuTTY is one guy's hobbyist project.

(If you insist on using windows, what about downloading SUA from microsoft themselves? That way you get a working SSH client without trusting anyone you weren't already trusting)

9
edwintorok 5 hours ago 0 replies      
Correction for step #10: the PuTTY keys are on the MIT keyservers, just not under Tatham's name, although they're only 1024-bit keys: http://pgp.mit.edu/pks/lookup?op=vindex&search=0xEF39CCC0B41...
10
larrys 2 minutes ago 0 replies      
"Its currently owned by someone named denis bider, who presumably just likes to domain-squat on other peoples product names and provide links. "

Another slam against squatters as usual. I really really wish people would stop with that already.

Whoever Denis Bider is, he has no obligation to even put up links to PuTTY. He could sell the domain name, maybe even to these people, who don't appear to be "using" it (by the HN and generally accepted definition of "using"). In other words: http://putty.com/

For the last time. There is no requirement to use a domain name and there never has been a requirement to use a domain name. And there are many people and companies who just sit on names and don't want to sell (because they don't need the money).

Talk to google about duck.com and see if you can buy it. You won't be able to.

Anyway he could put up a webpage as his personal blog or any number of things.

Just because you happen to have a product using a particular name doesn't mean you own that name in every TLD (.com, .net, .org, .info, .us, .biz and so on).

.org isn't even .com nor as desirable except perhaps for non profits.

11
RyanZAG 5 hours ago 1 reply      
Ah! A trick question game. The correct answer is to wipe off Windows and install Linux off your flash drive, right?
12
PythonicAlpha 5 hours ago 0 replies      
As far as I understand, even HTTPS and its infrastructure have plenty of holes.

How was it that some people broke into a certificate authority and stole master keys, so that a huge number of keys were compromised? I don't know if that has been repaired yet. Also, there exist many authorities that give keys to people without the simplest identity check. Such keys are a security risk of their own.

I also don't know, how good (or bad) the key withdrawal mechanism is working currently. I remember darkly (I am not current in these things) that there existed some problems with existing browsers, infrastructure and so on ...

And even when those things work fine ... as far as I know, there exist holes in the implementations, depending on which algorithm combination is used.

So there are so many attack vectors, that even in the best case (https works fine and you have a domain that belongs to the correct author ... and you have checksums ... and you check, if your browser tells you, that the certificate is perfect (who in the internet age cares, when the browser says that the certificate has some problem??) ...) there seems to be no security in the internet age ....

(And I am not even speaking or thinking about governments spying on us all)

13
MichaelGG 5 hours ago 0 replies      
I've become acutely aware of this over the past couple days. I'm setting up a new laptop, using VMs for all work. Getting VMware is easy - it's signed. But from there? Things start sucking. I need to fix my "ThinkPad" fan and trackpad (new ThinkPads don't actually have a middle button despite the dots appearing like they are one) - gotta download unsigned blobs.

Since I want as little software installed on the host as possible, I'm going to have to start a VM on something like Azure (easiest) with Visual Studio, and build my own copies of these tools if possible. The culture of building stuff on Windows is fairly weak, so I imagine I'll run into all sorts of issues.

It's pretty embarrassing that Windows doesn't ship with a lightweight way of creating "VMs" to increase security. Something like Sandboxie would be a welcome piece of OS functionality.

The JS crypto comment is off-base. The discussion about JS crypto is that it's pointless because it's only as strong as TLS - it doesn't provide anything else, and it's very easy to get it wrong and do more damage (due to ease of XSS and whatnot). Sandboxed execution is a fantastic thing, and even MS tried that with .NET and its million code-access-security policies. And now everyone does that with Android/Windows Store style permissions (although not as fine-grained).

14
pdonis 2 hours ago 0 replies      
The title is missing a word: it should be "Downloading Windows Software Safely Is Nearly Impossible". Similar remarks would apply to OS X for any software not supplied by Apple. Fortunately, Linux distros have package managers.
15
m0dest 5 hours ago  replies      
People complain that OS X requires apps to be signed by Apple (by default). But in reality, it's the sanest solution to this problem.

When the OS enforces signature checking, you don't have to worry about whether it was downloaded over HTTP or who owned the domain name.

12
Say Hello to Cortana, Microsofts Siri Equivalent gamerevolution.com
16 points by btimil  1 hour ago   19 comments top 8
1
Uhhrrr 3 minutes ago 0 replies      
Possible branding problem: Cortana the Halo character goes insane after seven years.
2
TrainedMonkey 52 minutes ago 3 replies      
Nice Halo reference! Microsoft already has Kinect, so they are not starting from scratch in terms of voice recognition/processing.
3
madsushi 33 minutes ago 0 replies      
I can see it now.

"OK, Cortana, show me how to get to Mars."

http://www.youtube.com/watch?v=NCCk1atehQc

4
bitwize 19 minutes ago 0 replies      
What's next, GLaDOS on our Steamboxes?

"GLaDOS?"

"I hate you."

"Bring up Team Fortress 2"

"Wouldn't you prefer a nice game of chess? Or how about 'Pass the Hot Deadly Neurotoxin'?"

5
basicallydan 44 minutes ago 1 reply      
Last time one of the big tech companies made a sci-fi reference for a new product it was a failure :( remember Google Wave?

One of these days, someone will build our dream Sci-Fi future. At any rate, Cortana is a good name.

6
Nux 42 minutes ago 2 replies      
Cortana, Siri ... Who comes up with these names? Thank fsck for Watson.
7
higherpurpose 11 minutes ago 0 replies      
Cortana sounds like a mouthful. Imagine saying "Cortana" every time you want to tell it something. And I imagine it would be even more embarrassing to do in public than saying "Siri".
8
theChips69 41 minutes ago 1 reply      
Nerds
13
Jurassic Park computer system in the browser jurassicsystems.com
221 points by tojr  9 hours ago   64 comments top 25
1
malux85 8 hours ago 3 replies      
Wow! This is cool.

The computers in Jurassic Park is what got me into programming when I was young. I saw their 3D weather overlay when the storm was approaching and thought "THAT IS IT!". I was 8 years old at the time.

I was determined to learn and create that system - I started learning C, so that I could open and close files. I learnt OpenGL so that I could create the 3D scene. I learnt socket programming so that I could download weather images from the NOAA.

It took me until I was 15 (I was totally self taught and it took me a while before I could learn all of the vector / matrix math myself) but I ended up building it! I made a program in C that downloaded the weather jpg, cleaned it up (removed noise, removed the NOAA logo, smoothed out the clouds) then generated a heightmap in OpenGL, and made a fly-over.

I wish I still had the code, it's sitting on my old computer at the Farm in New Zealand :<

2
bhouston 8 hours ago 3 replies      
SGI IRIX operating system: http://en.wikipedia.org/wiki/IRIX

I used that at my first programming internship at university. Even then (1999) it was a dated OS, but it was still fun.

I wonder if they used IRIX in the movie because the VFX guys making the movie, ILM, were doing all the VFX work using Softimage on SGI machines running IRIX at the time.

3
hcarvalhoalves 4 hours ago 0 replies      
The cool thing about Jurassic Park is that it showcases all this high-end tech (at the time) to set the mood. The setting of this movie made a huge impact on me as a kid.

In the computer room scene you can see a CM-5 [1] from Thinking Machines in the background (or at least, only the front panels with the futuristic blinking red leds). Those were very interesting computers when launched, with a fundamentally different architecture for parallel computing.

Also, fun story: the company hired Richard Feynman [2].

[1] http://www.sgistuff.net/funstuff/hollywood/images/hwd_jpark9...

[2] http://longnow.org/essays/richard-feynman-connection-machine...

4
Aqueous 8 hours ago 3 replies      
The Jurassic Park computer system...created and maintained by Newman. Credit where credit is due: it takes a lot of skill to keep a bunch of velociraptors pent up using a Macintosh LC II.

What always amused me is that in order to trigger the locks, the "computer whiz" girl had to navigate some sort of 3d control environment, probably the most ineffective way possible to control something that should be random access.

5
famousactress 8 hours ago 0 replies      
Haha! My mom was a Sun reseller who sold them a bunch of SPARCstations for the movie, and I think she even wrote some shell scripts to launch spinning 3D dino-skull demos on startup, etc. Around that time we were a pretty rare family home, with one of those boxes as the family computer and an ISDN line at home.
6
dmm 8 hours ago 0 replies      
Pour one out for IRIX, discontinued December 31, 2013.http://en.wikipedia.org/wiki/IRIX#Retirement
7
k-mcgrady 8 hours ago 1 reply      
Really nice recreation. Why the annoying dialog repeatedly telling me not to use Safari though? It seemed to work fine for me in Safari.
8
tylermauthe 8 hours ago 1 reply      
Some other commands...

> access main security system please

> display zebraGirl.jpg

9
capkutay 31 minutes ago 0 replies      
Weird I can't get into the security system..
10
erlkonig 2 hours ago 0 replies      
I used fsn on the SGIs in 1994/5, but I think this initially free (as in beer) program was later bundled with some for-pay package and vanished from the downloads area.

The funny part is I probably still have a working copy of it on the SGI Onyx in my garage.

11
edgeman27 9 hours ago 2 replies      
It's a Unix system. I know this.
12
basicallydan 5 hours ago 0 replies      
You should've seen the grin on my face when Newman's smug little face started speaking to me. You've fulfilled a childhood dream for me, thank you.
13
astrojams 6 hours ago 0 replies      
This brings me so much joy. This movie - the SGI and this scene in particular made me want to switch from being a business major to a computer science major in college. I did and it was the right decision.

I bought an SGI Indy and spent a lot of time in the shell just so it would look and feel like the scene from this movie. I don't know why it affected me so much but it did.

So much love for the author who created this simulator.

14
edgeman27 9 hours ago 2 replies      
I like the irony of the site telling me not to use Safari.
15
fennecfoxen 9 hours ago 0 replies      
16
yashg 8 hours ago 1 reply      
Where's the 3D model to lock/unlock the doors?
17
trekky1700 4 hours ago 0 replies      
Having loved this movie since childhood, this was really awesome.
18
cmatteri 4 hours ago 0 replies      
Play this song for the full experience: http://www.youtube.com/watch?v=zQuH4woPDn0
19
banachtarski 5 hours ago 0 replies      
The zebra girl is a nice touch.
20
shalander 9 hours ago 0 replies      
You didn't say the magic word!
21
filmgirlcw 6 hours ago 0 replies      
As a kid, this was my dream system. This is so great.
22
angelgcuartero 7 hours ago 0 replies      
I used to work with some Silicon Graphics in the 90's in a F-18 Simulator for Spanish DoD. We used C with GL (the graphic library that existed before OpenGL) and had Indigo workstations for each developer. Origin and Onyx were used for calculation and visualization. When they offered me that job I couldn't refuse! I was like "Yes! I'm going to use the same computers that appeared in Jurassic Park!!!" :D
23
cessor 7 hours ago 0 replies      
Did anybody notice the picture of J. Robert Oppenheimer next to the screen? Details, my friends...
24
guiomie 6 hours ago 1 reply      
What is the hidden feature ?
25
simon1246 9 hours ago 0 replies      
legit
14
I don't know 42floors.com
108 points by do  6 hours ago   44 comments top 17
1
saosebastiao 4 hours ago 0 replies      
I swear to god, every time I read a post by Jason, I wish he was my manager. I consider myself a highly productive person, but constantly hampered, directly and indirectly, by terrible managers. Jason always comes off as the kind of person who amplifies the abilities of his team. That is all... I'm gushing and it's embarrassing.
2
alexandros 2 hours ago 1 reply      
The problem is that the engineer's "I don't know" gets beaten out of you when you speak to users and investors. You get made to feel like an amateur for admitting uncertainty. Which is absurd in the startup world of all places. So it's one of those doublethink situations, where you need to keep your internal accounting separate from the marketing talk. And then you need to know who to tell what on top of that. And then actually deal with the actual uncertainty on top of that. As if startups weren't hard enough in general, we're making it even worse for ourselves.
3
nsomaru 4 hours ago 1 reply      
In Indian culture, knowledge (viveka) is always accompanied by humility (vinaya).

Thus, counter-intuitively, saying 'I don't know' is an indicator of the speaker's knowledge, as he is aware of areas of his ignorance.

Contrast with an arrogant attitude of "I-know-everything-or-can-find-it-out" and the fuck-ups that inevitably follow.

Aside: I would really like to know what the solution to keeping the listings updated was. Or, is that a trade secret? ;)

edit: s/Aide/Aside

4
georgemcbay 1 hour ago 0 replies      
Good point, and I agree: I also enjoy working with people who can admit when they just don't know something. It shouldn't be so hard to do; there is a nearly infinite amount of stuff each of us doesn't know -- especially when it comes to forward-looking challenges we just haven't started thinking concretely about yet. My favorite response when asked about such things is "I'm not sure; we'll burn that bridge when we get to it".

This topic is even more relevant for software over other types of knowledge, IMO, keeping in mind the Knuth quote "Beware of bugs in the above code; I have only proved it correct, not tried it".

Even when I'm pretty sure that task XYZ can be completed relatively easily using language A, framework B and API C, I still won't answer in complete absolutes unless I've worked on the compiler for A, written a fair amount of B, and previously exhausted all the functionality of C.

5
Jemaclus 4 hours ago 2 replies      
It's not just "I don't know" that is important, but also the next part of the thought: "but I can find out."

Just admitting you don't know is great, but it's not good enough. You have to also be able to take the next step and find out.

If I ask two people if they know how to do something, and one of them says "I don't know" and the other says "I don't know, but I can find out," then... well, I don't really need to complete that thought. You can see the difference right there.

But other than that, Jason is spot on. This is something I've been keeping an eye out for for years. When I interview people, I ask a question, and when they're clearly bullshitting me, I'll stop them and say, "It's okay if you don't know." The best candidates will often back down, like Jason did in his example, and say "Yeah, I don't really know, but I'll figure it out."

Bullshitting me with an answer isn't a red flag. It means you're at least thinking about the problem. But admitting you don't know but are willing to figure it out is a far, far superior answer. Very few real-world problems demand an answer RIGHT NOW. Most can wait until some research is done.

6
zaidf 4 hours ago 0 replies      
If you found the post interesting, you may also like reading how Kiran seems to have attacked the question he posed to Jason: http://divvela.com/post/77189320557/an-army-to-help-you
7
johngalt 4 hours ago 0 replies      
Someone who has all the answers probably has an important answer wrong.

Also, don't look at people's plans, look at their criteria.

8
scrabble 2 hours ago 0 replies      
If I don't know something, I've made it a point to say so.

Generally, I follow it up by letting whomever know that I can attempt to discover an answer to the question.

If the question is about how something works, I'll occasionally explain how I think it might work based on how I'd implement it, but again advise that it's just a guess and I really do not know.

9
cmbaus 3 hours ago 0 replies      
While the point of the article isn't the domain problem, I'll add that the hardest management job I've had was trying to retroactively fix a data aggregation process that had been built manually over many years.

The work was extraordinarily tedious, error prone, yet business critical. We had to keep shimming in new parts of the system, while keeping the old system afloat. While the system was always on the verge of imploding, from the outside, it basically appeared to work, which kept the project chronically understaffed.

I have a lot of respect for anyone who aggregates time-critical production data from different sources. The nature of the work makes it extremely stressful, yet surprisingly easy for outsiders to underestimate.

10
collyw 4 hours ago 5 replies      
I wish politicians would have this attitude instead of avoiding the questions and coming out with meaningless statements.
11
btilly 4 hours ago 0 replies      
My attitude is that preemptive honesty establishes credibility. Even more when it is backed up by everything else you do.

By contrast up front evidence of defensive behavior is a red flag for me in all sorts of situations.

12
kliao 4 hours ago 1 reply      
I really like this idea and try to apply it as much as possible. In my experience, for any kind of teamwork, the faster we get to the "I don't knows", the faster we solve problems, while stating half-truths/guesses-as-facts spreads confusion and kills productivity. I've also noticed that the type of person who is afraid to admit knowledge gaps is usually also the one who looks at me funny or expresses disapproval whenever I use the phrase. It's as if they are confused as to why I would ever admit such weakness.
13
thomseddon 3 hours ago 1 reply      
<title>the_title();</title>
14
adamzerner 2 hours ago 1 reply      
> Kiran explained that he likes it when people say "I don't know" because it lends credibility to everything else that they've said.

I'd take that a step further, and say that people who assign proper levels of certainty to their beliefs tend to be credible. Most people seem to only be able to think in absolutes.

15
tadmilbourn 4 hours ago 0 replies      
I like to think of this as being "confident in your ignorance." Whether you're a manager or early in your career, the way in which you admit what you don't know and ask clarifying questions can have a profound impact.
16
larrys 2 hours ago 0 replies      
"While there are lots of tactics, there is no one true silver bullet. "

I'm old enough to have worked with Coldwell Banker when they rolled out their commercial real estate operations on the East Coast (they had been only a west coast company at first).

What they did was assign a college student (typically on summer break or part time) to go out to each area and literally catalog all the commercial office space for part of the time and then the rest they would spend in the office calling up the building owner (or manager) and verifying the data collected.

This was many years ago but it worked quite well. A database was compiled and updated.

Bottom line is, no matter how you cut it, this is a manual process.

You know why the yellow pages was such a success? Relevant data that, while it only came out once per year, was typically highly accurate for that year.

They made enough money (by selling advertising to a captive audience) that they were able to employ an entire quality checking and sales organization to verify and collect the data which ended up in a product that was widely used up until the internet came along. But once again they had the profit margin and captive audience to make the whole process work and that was a key part of that success.

15
Alcatraz Package manager for Xcode 5 alcatraz.io
121 points by mneorr  6 hours ago   48 comments top 21
1
Argorak 3 hours ago 2 replies      
Um. Don't run the installation instructions.

While the download itself is served using https (from amazon), curl will contact the google url shortener using HTTP. Honestly, if I wanted to MITM one thing on any network, URL shorteners would come first.

Edit: The website switched from googles link shortener to git.io (http) and download to github downloads. git.io's https version seems to have certificate issues.

    > curl https://git.io/lOQWeA -vvv
    ...
    * SSL certificate problem: Invalid certificate chain
    ...

2
guptaneil 6 hours ago 4 replies      
The first thing that stuck out to me was the name and logo. Alcatraz sounds cool (and I like the logo), but I'm not sure if something that evokes imagery of being confined and locked up is what you want associated with an open-source package manager.
3
eridius 39 minutes ago 0 replies      
Xcode doesn't have a public plugin API. Every single one of these packages, including Alcatraz itself, is relying on undocumented and unsupported functionality. I would very strongly caution against installing any of it.
4
nextstep 4 hours ago 1 reply      
Very nice! I've been following this for some time and am very happy to see this stable release!

An aside: Has anyone tried the Clang Formatter plugin? I want to format property declarations like:

  @property (nonatomic, strong) NSString *string;
But setting 'ObjCSpaceAfterProperty' to true or false both output:

  @property(nonatomic, strong) NSString *string;
My format config is based off llvm's. Maybe some other configuration is stomping on the 'ObjCSpaceAfterProperty: true'?

(Does anyone have a .clang-format file that matches Apple's style?)
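For anyone starting from scratch, here is a minimal .clang-format sketch (the option names are real clang-format style options, but whether ObjCSpaceAfterProperty is actually honored depends on your clang-format build -- it has been reported buggy in some versions, and the values below are just a guessed starting point, not Apple's actual style):

```yaml
# Hypothetical starting point, based on the LLVM defaults:
BasedOnStyle: LLVM
IndentWidth: 4
ColumnLimit: 100
# Intended to yield "@property (nonatomic, strong) NSString *string;"
ObjCSpaceAfterProperty: true
ObjCSpaceBeforeProtocolList: true
```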

5
k-mcgrady 6 hours ago 5 replies      
What are your reasons for starting another package management system when we already have CocoaPods? What does this offer over CocoaPods besides the UI?
6
HaloZero 6 hours ago 0 replies      
I've been using this and it works great for the packages it has. I don't use too many, but if you document your Xcode methods I highly recommend VVDocumenter.

It allows you to generate a doc string for a method if you type '///'

7
orta 5 hours ago 1 reply      
A huge congrats. I've been working with Alcatraz HEAD for a while, helping out with the design and occasionally wanting to make my own plugins.

I'm super excited to see it out and one-click installable again. Looking forwards to seeing what Marin/Delisa/Jurre do with the blog.

I use this regularly, and it hasn't felt any less stable for the few plugins I mainly use: open in GitHub, one in AppCode & a fuzzy string matcher. They really make Xcode easier for day-to-day life.

8
eddieroger 6 hours ago 1 reply      
This looks really cool, but I don't know what problem it's solving for me. Maybe I'm not an Xcode hacker/ninja/whatever, but it has consistently met my needs as-is. Sure, I've wanted to add a color scheme before, but these are a lot of hoops to jump through when Dusk is fine.
9
biot 3 hours ago 1 reply      
Are the plugins run in any kind of sandbox? Is there any security review performed on plugins other users submit?
10
wsc981 5 hours ago 0 replies      
Very nice tool, though it seems the installation of ClangFormat[0] fails somehow. I guess I'll try to install it manually.

[0]: https://github.com/travisjeffery/ClangFormat-Xcode

11
chrisdevereux 6 hours ago 3 replies      
How stable do people generally find Xcode plugins?

The few I've tried haven't been great (they've tended to crash a lot and break with Xcode updates)

12
yohann305 2 hours ago 0 replies      
Now, add a price field, and sir, you've got yourself a sustainable business!
13
theswan 3 hours ago 0 replies      
Interesting that just a little bit ago there was an article on software installation security (albeit from a different angle).

Theoretically speaking, is it safe to curl and install something via plain http:// and no checksum verification?
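Generally no, not without pinning something. A minimal Python sketch of the verify-before-run pattern (the payload and names here are made up; in real use the expected digest would be a constant published over a trusted HTTPS channel, not computed from the download itself):

```python
import hashlib
import hmac

def verify(payload: bytes, expected_hex: str) -> bool:
    """True only if the payload's SHA-256 matches the pinned digest."""
    actual = hashlib.sha256(payload).hexdigest()
    # constant-time comparison, though for public downloads a plain == is fine
    return hmac.compare_digest(actual, expected_hex)

# Stand-in for the bytes that `curl` would have fetched:
payload = b'echo "installing"\n'
# In real use this is a pinned constant, not derived from the download:
published = hashlib.sha256(payload).hexdigest()

assert verify(payload, published)
assert not verify(payload + b"tampered", published)
```

Plain http:// with no checksum means a MITM can hand you arbitrary shell commands; the hash only helps if it travels over a channel the attacker can't also rewrite.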

14
yohann305 2 hours ago 0 replies      
OMG! It's like "XCode on Rails"!

At last!

15
bound008 4 hours ago 0 replies      
YES! It's back!
16
accatyyc 6 hours ago 0 replies      
Wow! Thank you for this. In 5 minutes I found 5 plugins I had no idea existed but truly brightens my day. Keep up the good work!
17
eliperkins 6 hours ago 0 replies      
As someone who spends most of their day in Xcode, Alcatraz is an invaluable tool to grooming my Xcode setup and keeping up with the latest plugins and what not.

Congrats on the launch Marin! Been following the repo for a few months now, I'm really digging the design.

18
bib971 4 hours ago 1 reply      
Just curious, how to implement the scrolling effect like this site? Does it require JavaScript or just CSS?
19
danielrakh 5 hours ago 0 replies      
I think it's awesome. A part of me wishes there was a drop down summary of each plugin when you click on it, rather than heading to the github link.
20
kerbs 6 hours ago 1 reply      
Crashes Xcode for me when entering the package manager.

I work for The Man (with a firewall), and I'm assuming it's due to some non-HTTP ports being used?

21
ahmadeus 5 hours ago 1 reply      
Great work. Now the next big thing is to mix this with CocoaPods so that once you select a package, it runs the pod install command for you automatically and you are good to go.
17
Why We Love You - aggregate positivity for your friends whyweloveyou.com
34 points by rfong  3 hours ago   5 comments top 2
1
chadillac 1 hour ago 3 replies      
I hate to be a Negative Nancy, but this is like a cyberbully's wet dream. There is no authentication measure to ensure who is who, and it gives the user a vested interest in hearing what good things people have to say about them. It is therefore basically a point that can be easily leveraged to conduct the most malicious of attacks against someone's character, and to ensure the attack is seen but, more importantly, done in a manner that cuts very deep.

As you can see, Barack Obama is a kitten killer in your demo.

http://www.whyweloveyou.com/reesekitty

2
olav 1 hour ago 0 replies      
Strange, I made a very similar thing a few years ago at http://dankbarkeit-ist-nicht-erforderlich.de/
18
OpenStreetMap provider CloudMade shuts its doors on small users ericjiang.com
30 points by erjiang  3 hours ago   15 comments top 7
1
Doctor_Fegg 1 hour ago 0 replies      
A few pointers for anyone stranded by this.

Tiles:

* MapQuest Open: free tiles, good all-round cartography, high availability and very generous terms of use. The default option for anyone who "just wants a map". http://open.mapquest.com/

* MapBox: need no introduction, easily the biggest and slickest OSM-powered operation out there. Some really great technology, smart guys (they've hired a lot of the brightest stars of OSM), and a bigger commitment to "giving back to OSM" than CloudMade had. Wide range of plans from free to "Enterprise". http://mapbox.com/

* Thunderforest do a range of attractive specialist cartographies, of which OpenCycleMap is the best known, and can design and host styles to order. http://thunderforest.com/

* Set up your own server: reasonably easy for anyone with a little Unix experience; runs fine on a €59/month Hetzner box. Docs at http://switch2osm.org/ (disclaimer: my site)

* And for the client-side library, pretty much everyone uses Leaflet now: http://leafletjs.com/

Geocoding:

* MapQuest Open run an instance of Nominatim, the standard OSM geocoder, again with high availability and generous ToUs. Setting up your own Nominatim instance isn't trivial, so use MapQuest's unless you have a really good reason not to.

* Mapbox has a basic geocoding API that comes as part of their plans.

Routing:

* OSRM is the standard router, capable of blisteringly fast results which facilitate Google-style draggable routes. Source and public instance at http://project-osrm.org/ . Be aware that building the routing graph for the whole world takes a lot of RAM (though you can EC2 it if you like). The author, Dennis Luxen, has been hired by MapBox whose Directions product is expected imminently.

* MapQuest Open also offers routing, slower than OSRM but a nice free public instance.

* GraphHopper is gaining some traction for lower memory requirements than OSRM, though also fewer features. Java. http://graphhopper.com/

All-round OSM shops:

* Geofabrik are very much at the heart of the OSM ecosystem - pretty much everyone has downloaded their invaluable data extracts at one time or another. Hugely knowledgeable, helpful and capable of pretty much anything you can throw at them. http://www.geofabrik.de/

* Gravitystorm, the OSM consultancy side of Thunderforest: http://www.gravitystorm.co.uk/

(And a personal note: I do cartography (but not hosting), plus general OSM knowledge with expertise in editors and in routing inter alia. Link in the profile as per usual.)

2
dboyd 2 hours ago 2 replies      
Just started using OpenStreetMap on the weekend. I found Mapbox[1] to be very easy to get started with. Their plans also seem to specifically benefit the 'small user' (i.e., a free developer account). At the same time I have to assume they serve big customers well too, since Foursquare is listed as a customer.

I'm no OpenStreetMap expert (yet), so maybe a CloudMade/Mapbox comparison isn't entirely fair.

[1] https://www.mapbox.com/

3
ris 2 hours ago 0 replies      
Anyone in the OSM community will be unsurprised by this. Cloudmade has done little of any use to anyone for quite a while. This news means next to nothing for OpenStreetMap.
4
corford 1 hour ago 1 reply      
When I got the email that they were shutting down the API at the end of April, the first thing I rushed to check up on was the status of Leaflet.js (which I think had been sponsored by Cloudmade?). Fortunately, it seems it's now sponsored by Mapbox: http://leafletjs.com/2013/11/18/leaflet-0-7-released-plans-f...
5
lutusp 38 minutes ago 1 reply      
Quote: "Now, we're left with almost no options for custom hosted OSM tiles."

As to "custom hosted tiles", yes, but you can always download the entire current OSM database, or any selected subset, and generate your own tiles by converting from the original vector data. There are a number of programs designed to do this for you.

6
mtmail 2 hours ago 0 replies      
There are even more alternatives for small users and providers seem to start every couple of months, e.g. https://geodienste.lyrk.de/pakete (German), http://thunderforest.com/, http://www.toursprung.com/products/maptoolkit/ and the OSM wiki lists http://wiki.openstreetmap.org/wiki/Tiles#Servers
7
markwillis82 2 hours ago 1 reply      
This is such a shame.

Having used the OSM data to generate our own tiles, I know it takes a lot of time and resources to manage them well (let alone keeping the OSM data updated).

I understand it must be tough to provide such a service at low cost. Hopefully something will come up from the ashes.

19
Keurig Will Use DRM In New Coffee Maker techdirt.com
73 points by midnitewarrior  6 hours ago   96 comments top 20
1
Osiris 1 minute ago 0 replies      
I suppose that one of the benefits of not drinking coffee is not needing to worry about all of this.

The lunch room at the office has a Keurig machine and it gets jammed a lot because of the air pressure differential (Denver is at 5200 feet). If you don't properly puncture the package when you insert it, the air pressure causes something to go wrong and the machine jams up.

2
ignostic 3 hours ago 0 replies      
This case doesn't have anything to do with DRM as the title suggest. It's more clear in the article: "the java-bean equivalent of DRM". As far as I can tell it's actually just allegations of patent abuse and anti-competitive vertical integration.

Keurig had a patent on the technology for both machine and pod. That patent expired in 2012, allowing third-party vendors to start making pods. Treehouse Foods argues (among other things) that Keurig is changing the design of their machine and the pods. The allegation is that the new design doesn't have any practical purpose other than to maintain the patent on pods for several years.

If true, this is going to be an interesting case. Keurig can change the design of their machine whenever they want in theory, but can they do so for no other reason than to maintain their own market share? They'd be effectively forcing everyone to pay them to compete at all so long as people buy new machines.

Drug companies often employ similar patent tricks in order to maintain their dominance of a market via extended patents, patents on new uses for the same chemical, and slightly-altered and somewhat improved chemicals. Consumer groups have complained about this for years, but nothing serious has changed yet.

3
mikegioia 4 hours ago 10 replies      
I actively try to dissuade people from buying Keurig machines, or any "coffee pod" machine and this is just another reason why. Aside from the fact that the coffee is watery, the pods just add such an unnecessary amount of garbage to something that doesn't need to generate any waste.

I think the concept is cool but re-usable pods are the way to go. This is stupid of Keurig.

4
nsxwolf 4 hours ago 1 reply      
I'm not sure why they would attempt this. If they think they're going to get DMCA protection out of it, Lexmark already lost a case that seems to be this exactly:

Lexmark Int'l v. Static Control Components: http://en.wikipedia.org/wiki/Lexmark_Int'l_v._Static_Control...

All they can do is make it harder for competitors to make compatible cups, but that sounds like a fool's errand.

5
giarc 4 hours ago 4 replies      
Keurig is free to do what they want, just as a consumer is free to choose whichever coffee maker they want.

Maybe this will blow up in their face, or perhaps given their market share it will all work out.

6
jgh 3 hours ago 0 replies      
Well, if you like coffee and are in the US (or Canada maybe now, I'm not sure) I'd recommend checking out Tonx (www.tonx.org) I've been subscribed for about a year and a half now and can say that I really love everything they send us.

It's not going to get you some instant Keurig coffee but honestly you don't have 10 minutes in the morning to brew something decent? Like put it on and then get dressed or something.

7
shittyanalogy 4 hours ago 0 replies      
Regular Coffee: Grounds inside a tin can you scoop. Depending on your process, a thin paper filter. Coffee to volume is very dense for shipping.

Pod Coffee: Single use plastic pods with a foil lid inside a plastic bag inside a cardboard box. Coffee to volume is very sparse for shipping.

The benefit of not having to use a spoon is worth all the extra trash? Now with DRM?

8
jareds 3 hours ago 4 replies      
So what is an option for easily brewing a single cup of coffee without pods that will lock you into a specific machine? The things that I like about my Keurig are that you can get 4 to 6 cups before having to refill the water tank and that it's easy to clean. While I don't mind having to throw away a filter, I would like to avoid having to refill a machine with water every time I want another cup of coffee.
9
jessaustin 4 hours ago 0 replies      
This strategy seems kind of backwards to me. Keurig isn't a dominant household appliance marketer using that position to jumpstart a lucrative tie-in sideline. While that would be regrettable for consumers, it might actually work for the marketer. Rather, they are a food marketer who will attempt to defend their "primary" business with a tenuous temporary lead in a kitchen appliance category. How long will it be until Hamilton-Beach, Panasonic, Oster, or KitchenAid notice this market oddity and stomp them flat? Maybe Keurig should just roast better coffee?
10
joesmo 3 hours ago 0 replies      
"Such lock-out technology cannot be justified based on any purported consumer benefit, and Green Mountain itself has admitted that the lock-out technology is not essential for the new brewers' function. Like its exclusionary agreements, this lock-out technology is intended to serve anticompetitive and unlawful ends."

This is not different from any other DRM mechanism. Unfortunately, that also makes it legal.

11
pwenzel 3 hours ago 0 replies      
I think I'll stick to my Chemex and compost my grounds.
12
kgermino 3 hours ago 1 reply      
>third-party pod refills that often retail for 5-25% less than what Keurig charges

How expensive are these pods? I certainly understand going generic to save 25% but a K-cup machine strikes me as a premium product. Maybe I'm missing something, but I'd think the people willing/able to pay extra for a high end coffee maker wouldn't waste time experimenting with generics to save 5% (what, a couple cents a cup at most?).

13
utopkara 4 hours ago 0 replies      
Good luck to them. Their coffee machines/designs are horrible. Hopefully this will cost them a huge market share and people will drink less plastic.

http://www.coffeedetective.com/is-the-plastic-used-in-keurig...

14
jcampbell1 4 hours ago 2 replies      
Does anyone know of a source for generic K-Cups that are decent quality? I seriously thought about doing it as a startup as it seemed untapped when I was looking.
15
Zelphyr 3 hours ago 0 replies      
So now I have to ONLY buy approved pods so I can drink coffee and tea that taste like shit? No thanks. I'll keep buying my whole bean and using my AeroPress. Not as convenient, but they're also not greedy douchecopters.
16
brianbreslin 4 hours ago 1 reply      
Didn't their patent on the Keurig machines expire recently? These machines were invented decades ago, only hitting the mainstream in the last 5 years.
17
drdaeman 3 hours ago 0 replies      
Well, that's disgusting. Like printer cartridge vendors.

Luckily, this applies only to pods and guess probably can't do anything like that with the machines using whole beans.

18
corresation 4 hours ago 2 replies      
I believe both Tassimo and the "Nespresso" already have DRM, although they pitch it under the premise that it's a coded system for the perfect brew.

So in many ways the Keurig is late to the party.

19
jotm 3 hours ago 0 replies      
Well, I guess it's back to the good old coffee machines, then.
20
magic8ball 3 hours ago 2 replies      
To play devil's advocate and stop the circlejerking, let's do an intellectual exercise and try to think about why this might not be the dumbest decision ever. Maybe it could be more than just a ploy to increase sales of first-party cups. Maybe it's about quality control.

I grew up playing a lot of Nintendo games. SNES. N64. Gamecube. Wii. There was one thing I noticed about them. Nintendo always made the best stuff. They made the best controllers. The best games. The best memory cards. I had a 3rd party memory card that said it stored "56 blocks." But if you tried to put more than 20 on it, you'd sometimes find your data magically lost in the morning. So despite the GNU/Linux ethos (I'm typing this on a Linux machine), freedom to use 3rd parties does not always give a better user experience. More than that, I found Nintendo also had to approve every single game and accessory that was compatible with their system (seal of approval). So even that was considered the good stuff. This seal keeps a certain floor of quality control. Back in the days of the Atari, people were making all sorts of shit for it. Stupid consumers would buy it, use it with their Atari, and be frustrated. Nintendo forbade this and made sure everything that was associated with their product met a certain standard, thus giving a good experience. So maybe Keurig will use their DRM for this. A way to help consumers, perhaps, figure out which cups are good.

20
PulseAudio 5.0 freedesktop.org
25 points by conductor  3 hours ago   12 comments top 4
1
dfc 1 hour ago 4 replies      
Better release notes: http://www.freedesktop.org/wiki/Software/PulseAudio/Notes/5....

Personally, PulseAudio has always been "that thing that fixes all those problems I never had with ALSA." I realize there is more to it than that, but I have never really understood what PulseAudio does for me, a casual sound user. Sound on my computer is exclusively produced by mpd, vlc and flashplugin-nonfree, and I do not remember the last time I used a microphone on a Linux machine.[1] I have browsed through the docs a couple of times but I have never figured out why, let alone how, I would want to change anything.

The only thing about Pulseaudio that I am confident of is that it is less overkill than jackd for my use cases. What am I missing?

[1]: I guess I have used the microphone a couple of times but it was always because I thought I would try Skype again. But this always resulted in a ton of wasted time with 32bit libraries and an avalanche of qt dependencies. I would get skype to work but I used it so rarely that by the time I needed it again I had to go through all the nonsense all over again.

2
thomasahle 1 hour ago 0 replies      
'Changes at a Glance' from the release notes: http://www.freedesktop.org/wiki/Software/PulseAudio/Notes/5....

* BlueZ 5 support (A2DP only)

* Reimplementation of the tunnel modules

* Native log target support for systemd-journal

* Small changes here and there

* Many bug fixes

Personally it's probably been a year since last time I had to think about pulseaudio, which is the way I prefer it to stay.

3
hansjorg 2 hours ago 0 replies      
> This issue is yet to be resolved with the BlueZ developers

Anyone know the back story on the missing HSP/HFP support in BlueZ 5? Patent related?

Edit: it's mentioned briefly in the BlueZ 5 release notes (http://www.bluez.org/release-of-bluez-5-0/)

4
buster 1 hour ago 1 reply      
What's the major difference from 4.0?
21
Fast scraping in Python with Asyncio compiletoi.net
51 points by watermel0n  5 hours ago   6 comments top 4
1
thomasahle 1 hour ago 0 replies      
It's very interesting to see the Python libraries starting to experiment with the power of the new 'yield from' statement.
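For anyone who hasn't played with it yet, here is a toy sketch of the delegation that 'yield from' gives you (no real I/O -- `fetch` just stands in for what an asyncio coroutine would suspend on, and the names are made up for illustration):

```python
def fetch(url):
    # Stand-in for an async I/O call; a real asyncio coroutine would
    # 'yield from' a future here instead of yielding a string.
    yield "request:" + url
    return "body of " + url

def crawl(urls):
    results = []
    for url in urls:
        # Delegation: forwards the sub-generator's yields outward and
        # captures its return value, in one statement.
        body = yield from fetch(url)
        results.append(body)
    return results

gen = crawl(["a", "b"])
steps = []
try:
    while True:
        steps.append(next(gen))
except StopIteration as stop:
    final = stop.value

# steps == ["request:a", "request:b"]; final == ["body of a", "body of b"]
```

Before PEP 380 you had to hand-write that forwarding loop (and lose the return value); asyncio's coroutines are built on exactly this mechanism.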
2
avitzurel 3 hours ago 1 reply      
Code samples on your blog are really hard to read (the syntax highlighting colors make it impossible).
3
bfwi 4 hours ago 1 reply      
You can also use grequests (originally a part of requests). It has a grequests.map() function, that I use exactly for this scenario.
4
drakaal 4 hours ago 0 replies      
Looks OK. I agree with the author that many people roll their own.

Depending what you want to scrape, this might work better for you.

https://www.mashape.com/stremor/stremor-content-extractor-fa...

It will scrape articles even if they are in multiple divs. It even works on magazine layouts like TheVerge.com, which doesn't work in Readability (py, API, or JS).

22
Apple CarPlay apple.com
407 points by lele0108  15 hours ago   327 comments top
1
GuiA 14 hours ago  replies      
Yay, more touch screen in cars. More ways to distract, frustrate and confuse users who are operating a big box of steel weighing 3 tons and going at speeds the human brain has never evolved to appropriately deal with.

1) Not only are touch screens a very poor interactor in the first place [0], but why do you think planes and other complex machinery have stuck with physical controls? For operating complex vehicles/apparatus, you just cannot do better than tangible controls. Knobs, switches, sliders can be operated without looking at them while giving rich tactile feedback, they have no modes = 0 risk for confusion, you know where they're going to be located on your dashboard regardless of what you're doing, etc.

2) Self-driving cars cannot come fast enough, and every single innovation in the car industry that does not go towards electric self-driving cars is just useless fluff at this point. Seriously - then you'll be able to fiddle all you want with your phone, drink, travel while sleepy, argue with your spouse, whatever you want - we'll be saving tens of thousands of lives every year [1], and the secondary social benefits will be fantastic (fewer cars produced since they don't have to sit in a parking lot 99% of the time, people won't have to spend a year's worth of wages just to buy a car (and then a significant chunk to maintain it), etc.). If society were a game of Civilization, I'd be putting all of my resource points towards the "Self-driving cars" achievement.

Of course the insurance companies, car manufacturers, oil companies, etc. don't want that to happen- but seriously, fuck those guys. The benefits on human society at large here are so significant that there is no room for caring about the feelings of greedy old white men.

[0] http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...

[1] http://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in...

(the views in this post are a bit on the extreme side- but that's how interesting conversations get started :-)

23
Thoughts on YC's Female Founders Conference techrotica.tumblr.com
55 points by mbellotti  5 hours ago   18 comments top 12
1
Udo 3 hours ago 1 reply      
It's expected that every word anybody said during that conference will be scrutinized beyond what's appropriate, simply because it's such a touchy subject. That's probably the reason why the video of the conference was already taken offline today while I was watching it (edit: it's not, see below). Generally I think it would be more productive if judgements weren't done on a hair trigger, because that's what makes this topic so toxic when in fact it shouldn't be.

Personally I found the presentations very entertaining and insightful - and even though I didn't have the opportunity to watch the panel to the end, I was struck by the impression that for the most part these were normal startup stories. That's a good thing, because it means while there is work to do when it comes to making the playing field fairer to women founders, we're at a state where the main issue is not "how to cope with my gender" but "how do we get our startup off the ground".

The area where I believe much more work is required is how we get more woman programmers started. The article touches very slightly on that, but then misses the mark:

> To get a male founder to admit he doesn't write the code his startup depends on you have to twist his arm. With a female founder it's the second sentence out of her mouth. As if to say "PS - don't take me seriously"

A completely different way to read this is that people who code don't have a lot of status in circles of non-programmers. So a non-technical founder might in fact expect to be taken more seriously by professing hacking ignorance.

One reason why non-technical women founders are quick to volunteer that info might be a subconscious reflection of that nerd stigma, and of course that's probably a large reason why women don't become programmers as often as men in the first place. It seems to me that perception of social status is the core issue.

To be fair though it was a founders' conference, not a female hackers conference.

2
zaidf 4 hours ago 1 reply      
I don't know why this comment struck me as odd - maybe it's because the only people I know who identify themselves specifically as nontechnical cofounders are women.

This means that you seriously need to go meet more founders. There are probably as many male non-technical founders as there are technical ones. These founders (such as Alexis from reddit) have filled the role in a manner very similar to what Jessica describes.

3
pron 44 minutes ago 0 replies      
This kind of article shows why we need more women in tech. Many, many more. Critical and insightful but also respectful and modest.

One thing that was missing, though, was politics. Politics in the true sense of the word, namely, the power struggles in the tech society which give rise to the current status quo.

Also, she says, "No one pretended tech was a meritocracy". The thing is, tech is a meritocracy, and that's part of the politics of tech. Meritocracy, of course, is a joke. It is not a desirable state of affairs, because it should be obvious that "merit" is a false currency used to justify what is, rather than work toward what should be. It is the quintessential naturalistic fallacy.

It amazes me time and again how people can take the term seriously, which only demonstrates how dangerous it is. Such a blatant, perverted joke, a dystopia that some intelligent people mistake for a utopia. Wikipedia says this: "Although the concept has existed for centuries, the term "meritocracy" was first coined in the 1950s. It was used by British politician and sociologist, Michael Young in his 1958 satirical essay, The Rise of the Meritocracy, which pictured the United Kingdom under the rule of a government favouring intelligence and aptitude (merit) above all else... In this book the term had distinctly negative connotations as Young questioned both the legitimacy of the selection process used to become a member of this elite and the outcomes of being ruled by such a narrowly defined group."

I still can't fathom how meritocracy can be taken as anything but a negative. I mean, the first question that comes to mind (or, rather, the second after "what is merit") is, "who has merit and why, and who does not?" Once this question is asked, it is immediately apparent that any attempt to paint "meritocracy" in a positive light is ludicrous.

4
thatthatis 3 hours ago 1 reply      
It occurred to me recently that if the goal is to eventually get to gender equality, female focused tech groups should set the proportion of their tickets available at x% female, y% male where x and y are the inverse of their proportions at some equivalent male dominated group. Over time, this policy should pull men and women together instead of apart.

What good is it if women only know other women and men only know other men?

5
gmays 2 hours ago 0 replies      
I posted my thoughts on the conference in another related thread here: https://news.ycombinator.com/item?id=7334744

But since it's relevant, I'll post them again here. TL;DR: I thought the conference was awesome and look forward to watching it streamed again next year. Here's my original comment:

This was a great conference. I didn't attend (I'm male), but I watched the livestream and learned a lot from the speakers.

I especially liked the conference because I learned about the female perspective. My wife isn't a startup founder, but has a similarly stressful job (flies F/A-18's in stressful, life-threatening situations), with similar hours (14hrs/day and often weekends), in a similar environment (male-dominated Marine Corps). I often wonder how she'll manage her work when we have kids...everything from pregnancy to making time for the kids when she's busy. It's easier for me since I work at home now, but she'll deal with the same challenges that some of the founders mentioned. And knowing her, she'll feel incredibly guilty for not being around.

Anyway, I enjoyed the conference, learned a lot, and look forward to it next year.

6
btrautsc 1 hour ago 0 replies      
> when you're the nontechnical cofounder, your job is everything that's nontechnical including grocery shopping and errand running. I don't know why this comment struck me as odd

As a non-technical cofounder, this is the truth. I'm not sure what the expectations at large are, but I can corroborate this story and believe it should be most 'non-technical' cofounders' expectation of reality.

Take out trash, buy office furniture, go on sales calls, arrange the company insurance plan, order dinner on late nights, line up investor meetings, talk to users... I could literally go on forever - those are off the top of my head from last week.

Technical cofounders (and employees) should be maximizing their impact by doing technical aspects - other founders should be selling, marketing, building the business, and sometimes have to order food, clean up the office, or go see the company lawyers/ accountants/ & run errands.

7
kyro 2 hours ago 0 replies      
An overall well-balanced article, but I find it hard to believe that Jessica meant anything more regarding non-technical founders than someone having to carry their weight by taking care of the majority of non-technical tasks in a startup whose product is likely wholly technical. I've heard male non-technical founders recount memories of making coffee for coders to keep them happy. If anything, she gave a more accurate depiction of the role than someone who calls themselves the business or product guy.
8
mountaineer 1 hour ago 0 replies      
If you're interested in following the results of this conference and/or discovering more female founders, technical or not, I made a large Twitter list [1] to keep up to date.

[1] https://twitter.com/ryanwi/lists/female-founders

9
mbesto 2 hours ago 0 replies      
> It wasn't just that she referred to herself as a nontechnical cofounder - it's that she repeatedly diminished herself and her own qualifications at a conference supposedly organized to stop other women from doing the same thing. Has no one ever pointed out to Jessica that at the end of the day YC is an investment company and that she was the only YC cofounder with actual investment experience? That, if anything, she was the ONLY ONE of that group even remotely qualified to be there? Not a tagalong in her boyfriend's company who has to constantly apologize for her presence.

Interesting that this was the OP's response. I simply took it as her being humble, not diminutive - so for me, her message was clearly delivered: take pride in whatever it is you do and do it well, regardless of role or sex. Interestingly, after hearing her keynote I thought she was the "technical" co-founder (she had investment experience in an investment-based company) and pg was the non-technical co-founder (he was the hustler who gained the attention of the developers they would fund and matched them up with investors).

10
nawitus 26 minutes ago 0 replies      
" and deal with the boys"

Well.. that started out pretty sexist.

11
jpeg_hero 2 hours ago 0 replies      
Filed under: no good deed goes unpunished.
12
the_rosentotter 2 hours ago 0 replies      
An unrelated nitpick (this is HN after all): The white-on-black text really bothered me. I am seeing ghosting a few minutes after reading the piece (which I read in its entirety because it was pretty good).
24
Q A Data Language q-lang.io
162 points by miloshadzic  10 hours ago   56 comments top 17
1
snorkel 0 minutes ago 0 replies      
I like it because you can practically use this to define a grammar for parsing a programming language:

1. Each data type can inline express its own value constraints (not just type constraints, but rather value constraints) which is key to parsing code

2. With subclassing you can combine data types into higher-level objects that inherit all of the lower-level value constraints. This is similar to how a recursive parser drills down from a statement into the parts of each statement.

3. The data type value constraints support sequences, alternatives, and unions - most of what a parser needs to parse a language

With these ingredients you can actually parse code, and thereby write a code interpreter (or at least a parser). You begin with a grammar rule (a high-level data type) that defines a statement; a statement is defined as a list of alternative forms, and each statement form is defined as a sequence of keywords and supported value types, and so on down to the language primitives.

... but I'm not sure if it supports recursion, and thereby recursive descent parsing. Can you say something like "Sum = [Sum, Math_operator, Sum]"? If not, then you're limited to immediate values only and no nested expressions.
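The recursion question can be illustrated with a tiny grammar-as-data matcher (my own sketch in JavaScript, not Q itself; the rule shapes here are invented for illustration). A zero-argument thunk lets a rule refer to itself lazily, which also shows why a directly left-recursive rule like Sum = [Sum, op, Sum] would recurse forever, while the right-recursive form below terminates:

```javascript
// Tiny grammar-as-data matcher (a sketch, not Q): a rule is either a predicate
// on one token, a sequence of rules, an alternative of rules, or a thunk
// (zero-arg function returning a rule) so rules can refer to themselves.
// Returns the position after the match, or -1 on failure.
function match(rule, tokens, pos) {
  if (typeof rule === "function" && rule.length === 0) {
    return match(rule(), tokens, pos);        // thunk: lazy self-reference
  }
  if (typeof rule === "function") {           // predicate on a single token
    return pos < tokens.length && rule(tokens[pos]) ? pos + 1 : -1;
  }
  if (rule.seq) {                             // sequence: match parts in order
    for (const part of rule.seq) {
      pos = match(part, tokens, pos);
      if (pos === -1) return -1;
    }
    return pos;
  }
  if (rule.alt) {                             // alternatives: first that fits
    for (const option of rule.alt) {
      const next = match(option, tokens, pos);
      if (next !== -1) return next;
    }
    return -1;
  }
  throw new Error("unknown rule");
}

const num = (t) => typeof t === "number";
const plus = (t) => t === "+";
// Right-recursive form avoids the infinite loop of Sum = [Sum, "+", Sum]:
// Sum = Number "+" Sum | Number
const sum = () => ({ alt: [{ seq: [num, plus, sum] }, num] });

console.log(match(sum, [1, "+", 2, "+", 3], 0)); // 5: whole input consumed
```

If Q's constraint expressions can express the thunk-like self-reference, recursive descent falls out for free; if not, the "immediate values only" worry above stands.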

2
lkrubner 4 hours ago 1 reply      
I am surprised that EDN is not making more progress as a data interchange format, especially now that it has fantastic validators and coercion libraries, such as Prismatic's Schema:

https://github.com/prismatic/schema

This is, by far, the most intelligent data exchange system I have ever seen. It offers a richer language of types than what is offered by JSON, it is as readable as YAML, it is far more concise than XML. And yet it seems unable to break out of the world of Clojure. But it deserves to be taken seriously as a data interchange language.

3
skrebbel 9 hours ago 3 replies      
I really like this. In general, a well-supported non-crappy schema language for modern data exchange formats is long overdue (e.g. JSON Schema is horribly un-human-friendly). The fact that they aim higher and allow for a much more detailed set of validations is great, IMO.

I hope that this becomes a success and gets implementations in many programming languages.

5
tinco 9 hours ago 5 replies      
It's interesting, but is it powerful enough? There's a great talk by Zed Shaw about implementing authorization code. I think the same issue holds here.

A language might work for simple and more complex type constraints, but at some point the constraints might become so weird that you really need a full-blown Turing-complete language to define them.

And that point might come sooner than you'd like. Why not use something like Haskell instead of expressions in this constrained language?

It's got the super-powerful type system, and you're certain that it will be able to express any constraint you throw at it.

That said, this language does look nice, and I haven't really tried to find out if there are any obviously important constraints you wouldn't be able to build in it.

6
tel 8 hours ago 1 reply      
It's strange to me how dependent Q becomes on its host language. Instead of defining its own types in order to sit between various serializations and host systems, Q seems to simply augment the Host with a few extra types.
7
mercurial 9 hours ago 1 reply      
A "data language", in this context, is a language for representing and validating data. I'm not clear on why it's called "Q", or why it calls marshalling/unmarshalling undressing/dressing. The language itself looks alright, it's dependently typed. It doesn't say anything about the speed of the existing JS/Ruby implementations.
8
bane 7 hours ago 0 replies      
Oh dear, this might conflict with a SQL variant we cooked up at my old company, also called Q-Lang, where you could write a SQL statement, leave bits of the WHERE clause "blank", and it would generate a GUI interface with drop-downs and calendars and everything to let people fill in those bits without too much effort. Nothing terribly fancy, but it let you go from a working example SQL statement to an integrated query interface in the app in about 10 minutes.

It had lots of limitations, but most of the time it absolutely annihilated some of our competition's multi-month integration engagements.

9
islon 8 hours ago 1 reply      
In Clojure you can use schema (https://github.com/Prismatic/schema) which does the same thing without needing a separate language.
10
anonu 10 hours ago 1 reply      
I got excited at first - I thought they were referring to the Q language from KDB. This reminds me quite a bit of Google's protocol buffers.
11
hipsters_unite 6 hours ago 0 replies      
This is the second project named 'Q' that's been on HN in the last week or so. All cool ideas, but some Googling first by all parties would probably ease confusion.[0]

[0]https://news.ycombinator.com/item?id=7290655

12
nevi-me 4 hours ago 0 replies      
Am I missing something fundamental here? I think that on the home page Q shouldn't be compared to just JSON, as they are in principle two different things. If I were to write a 'data language', or to extend an existing language to cater for data, wouldn't I write my validators in that said language? If I am correct, then I was expecting Q to be compared with the closest other 'data language', not with a document/schema format.

For example, if I was doing it in JS, I'd say:

    var validDoc = true;
    var Temp = function (t) { return t >= 33.5 && t <= 45.0; };

then when I get the doc, I loop through my conditions, as such:

validDoc = Temp(doc.temperature);

// reject if !validDoc

From here, the benefit of Q becomes that it's much easier to do validations, instead of writing them from scratch, considering that my example above only returns a true || false, as it doesn't have the error handling to let me know what went wrong and where.

One could also achieve some of what Q is doing by using an ORM if said data is going to a database/document store.

So, HN, am I missing something here?

13
bjourne 8 hours ago 0 replies      
Given that you use a powerful RDBMS, such as Postgres, you don't need anything more than SQL constraints. For example, the constraint to check the temperature would be something like CHECK(temp BETWEEN 33.0 AND 45.0). To check that no two user accounts share the same alias, a simple UNIQUE(user_alias) will do. More complex constraints can be enforced using PL/pgSQL.

Then name your constraints; when the SQL layer throws them, it's trivial to remap them to user-readable error messages. It works especially well in newer Postgres versions, where the error reporting is much more detailed and the messages are easily machine-parsable.
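The remapping step might look like this (my own sketch, in JavaScript; the err.code / err.constraint field names follow what the node-postgres driver exposes for PostgreSQL error fields, but treat that shape as an assumption, and the constraint names are invented for illustration):

```javascript
// Map named Postgres constraint violations to user-readable messages.
// SQLSTATE 23505 = unique_violation, 23514 = check_violation.
const MESSAGES = {
  user_alias_unique: "That alias is already taken.",
  temp_range_check: "Temperature must be between 33.0 and 45.0.",
};

function userMessage(err) {
  // err.code and err.constraint mimic the fields a Postgres driver reports.
  if ((err.code === "23505" || err.code === "23514") && MESSAGES[err.constraint]) {
    return MESSAGES[err.constraint];
  }
  return "Something went wrong."; // fall back for unrecognized errors
}

console.log(userMessage({ code: "23505", constraint: "user_alias_unique" }));
// -> "That alias is already taken."
```

The point is that the constraint name becomes the stable key between the schema and the UI layer.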

14
dj-wonk 3 hours ago 0 replies      
Q is trying to do both data and validation, which are very different problem spaces. I think designing for both involves serious tradeoffs for the individual parts.
15
Bjoern 9 hours ago 0 replies      
Interesting... here is a tool which is also taking a data-oriented approach.

"Drake - a kind of make for data"

http://blog.factual.com/introducing-drake-a-kind-of-make-for...

16
mrcactu5 3 hours ago 0 replies      
Where is all the theory behind q-lang coming from? Are there theoretical principles behind this new schema language?
17
jdc0589 7 hours ago 1 reply      
I can't count the number of times I have wanted to set up re-usable model definition and validation, but haven't been able to find a good portable library that isn't part of something much larger I don't need. This definitely addresses a space in need of more options.

My answer was modlr (https://github.com/jdc0589/modlr), which I have been ignoring, but Q looks like it could do well. I'm not super pumped about the name, though; it conflicts with the super-popular promise library lots of people already use.

25
Algebraic and calculus concepts may be better way to introduce children to math theatlantic.com
99 points by tokenadult  8 hours ago   53 comments top 19
1
tokenadult 7 hours ago 2 replies      
I see the article title (which I put on the submission, and one comment commented on) was replaced by an HN moderator with a rewording of the article subtitle. I guess that's okay.

The article was especially eye-catching for me, because it was first posted among our mutual Facebook friends by my mentor in homeschooling mathematics, the mother of the first United States woman to win a gold medal in the International Mathematical Olympiad. The teacher featured in the article as the developer of innovative teaching methods for young learners is also a Facebook friend of mine, and we are part of a network of parents and teachers who are curious about how to stimulate interest in mathematics among young learners by doing things differently from the United States typical school curriculum in mathematics. I'm grateful that my children have been exposed to approaches like that (which I follow imperfectly in my homeschooling my children and in teaching other learners in classroom courses for my occupation), as that has helped my oldest son launch into the adult world with good problem-solving skills for hacking at the startup where he works.

There is a continual tension in mathematics education between teaching topics in their logical order as we understand mathematical topics in light of modern higher mathematics and teaching topics in their historical order of development (which is largely what happens in school mathematics courses). Children need some concrete, tangible experience with counting and with shapes to understand much about mathematics. The cool abstractions that motivate higher mathematics may be inaccessible to children who have no experience with tangible examples of those abstractions. (See

http://www.latimes.com/opinion/commentary/la-oe-adv-frenkel-...

an article submitted to Hacker News yesterday, for more about this.) But it looks like we can gain in mathematics instruction by letting children play with more sophisticated representations of mathematics than most children get to play with. Games are great learning tools. One of my favorite games for teaching mathematics is one I learned about from John Holt's book How Children Fail (originally published in 1964), which I read in 1971 on the advice of my school's assistant principal. The twenty questions game asks children to find a number from 1 to 1,000,000 by asking twenty yes/no questions, which the person who chose the unknown number must answer truthfully. It is an interesting mathematical exercise to show that twenty questions is (barely) enough for finding one number in one million if the twenty questions are used optimally. I have played this game many times with the children in my classes, and the opportunity to play this game again at the end of class is one of my strongest incentives for the children to stay focused during a lesson on a Saturday morning to get through the lesson content efficiently.
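The "barely enough" claim is just the halving argument: each yes/no answer cuts the candidate range in half, and 2^20 = 1,048,576 only just exceeds one million. A quick sketch of the optimal strategy (my own illustration, in JavaScript):

```javascript
// Each optimal yes/no question ("is it <= mid?") halves the remaining range,
// so the worst case needs ceil(log2(rangeSize)) questions.
function questionsNeeded(rangeSize) {
  return Math.ceil(Math.log2(rangeSize));
}

function guess(secret, lo, hi) {
  let questions = 0;
  while (lo < hi) {
    const mid = Math.floor((lo + hi) / 2);
    questions++;
    if (secret <= mid) hi = mid; else lo = mid + 1;
  }
  return { answer: lo, questions };
}

console.log(questionsNeeded(1000000)); // 20 - barely enough
console.log(guess(314159, 1, 1000000).answer); // 314159, found within 20 questions
```

With a range of 1 to 2,000,000 the same argument shows twenty questions would no longer suffice, which is what makes the game a nice gateway to logarithms.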

Playing with challenging problems appears to be the royal road for learning mathematics.

http://www.epsiloncamp.org/ProblemsversusExercises.php

I hope ideas like this spread through many communities in the United States and the English-speaking world, so that more young people gain more opportunity to learn to enjoy challenging mathematics early and often.

2
kalid 4 hours ago 1 reply      
(Disclosure: I'll be working with Maria on the calculus series for kids.)

I'm a big fan of the conceptual approach. One of the largest problems I see with math education is that we don't check if things are really clicking.

I graduated with an engineering degree from a great school, and still didn't have an intuitive understanding for i (the imaginary number) until I was about 26.

Go find your favorite tutorial introducing imaginary numbers. Got it? Ok. It probably defines i, talks about its properties (i = sqrt(-1)) and then gets you cranking on polynomials.

It's the equivalent of teaching someone to read and then having them solve crossword puzzles. It's such a contrived example! (N.B., this anguish forced me to write a tutorial on imaginary numbers with actual, non-polynomial applications, like rotating a shape without needing trig. See https://news.ycombinator.com/item?id=2712575)

Calculus needs these everyday applications and intuitions beyond "Oh, let's pretend we're trying to calculate the trajectory of a moving particle." They're out there: my intuition is that algebra gives a static description (here's the cookie), while calculus describes the process that made it: here are the steps that built the cookie. Calculus is the language of science because we want to know how the outcome was produced, not just the final result. d/dt velocity = acceleration means your speed is built up from a sequence of accelerations.
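The "rotating a shape without needing trig" trick mentioned above is just complex multiplication. A minimal sketch (my own, in JavaScript, not from the linked tutorial):

```javascript
// Rotate a 2D point by treating (x, y) as the complex number x + yi and
// multiplying by a unit complex number u = a + bi with a^2 + b^2 = 1.
// No angles or trig calls appear in the rotation itself.
function rotate(point, u) {
  // (x + yi)(a + bi) = (xa - yb) + (xb + ya)i
  return {
    x: point.x * u.a - point.y * u.b,
    y: point.x * u.b + point.y * u.a,
  };
}

// u = 0 + 1i rotates by a quarter turn counter-clockwise.
const quarterTurn = { a: 0, b: 1 };
console.log(rotate({ x: 3, y: 4 }, quarterTurn)); // { x: -4, y: 3 }
```

That kind of concrete payoff is exactly the everyday application the polynomial drills never provide.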

3
bane 6 hours ago 3 replies      
Yes yes yes.

The mechanics of basic calculus are remarkably simple. After a while it becomes a game of sorts with a pretty simple set of rules. Once you learn to "see" parts of the equation as symbol blocks (instead of numbers needing computation), you can move them around freely so long as you follow the rules. Algebra is simply a subset of the rules.

Teaching kids to think symbolically will help them in so many other fields.

I'm pretty convinced that with a little thought, you can teach basic derivation and integration to pretty young kids. Carefully craft the problems to avoid difficult division problems, avoid trig, let them use lookup tables for multiplication and you might be able to get kids under 10 to even do some of this.

Then "vertically integrate" other algebraic and trigonometric concepts into this framework, like adding new pieces to the game.
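As a taste of how game-like the rules above are, polynomial differentiation via the power rule fits in a couple of lines (my own sketch, in JavaScript):

```javascript
// Differentiate a polynomial given as coefficients [c0, c1, c2, ...]
// meaning c0 + c1*x + c2*x^2 + ... The power rule d/dx(c*x^n) = n*c*x^(n-1)
// is just "multiply each coefficient by its exponent, shift down one slot".
function differentiate(coeffs) {
  return coeffs.slice(1).map((c, i) => c * (i + 1));
}

// d/dx (5 + 3x + 2x^2) = 3 + 4x
console.log(differentiate([5, 3, 2])); // [3, 4]
```

A kid who can multiply small numbers can play this game with a lookup table; the symbolic moves are the whole lesson.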

4
nikhizzle 7 hours ago 2 replies      
I love it. This really appeals to me because of the all the bad ideas I had to unlearn in my 20s:

- math is about numbers and arithmetic operations on them

- being good at math meant you were good at arithmetic

- some people (meaning me) just didn't have what it took to be "good" at math. Reinforced by my high school math and physics teachers.

I hated math because I didn't understand that mathematics is a system for representing abstract concepts and manipulating them.

Eventually, on my 4th try at calculus, I took a class from Nick Fiori (http://www.yale.edu/education/about/faculty/fiori.elw062111).

His teaching methods opened my mind, and I've gone on since then to become an ardent amateur mathematician with minor publications and career tangents in machine learning and data science.

I can only imagine what would have happened if I had been taught math well from an early age.

5
yardie 6 hours ago 1 reply      
Does anyone know of any resources to introduce math to a 6yo? I have apps like Dragonbox to introduce algebra to my son. He played with it, completed it, but now is back to the crap IAP games typical on tablets these days. I wouldn't mind a few more like Dragonbox.

I also use flashcards to help with the rote math homework.

6
craigching 1 hour ago 0 replies      
This looks extremely interesting. I have a 5-year-old daughter, and I want her to see the playful side of mathematics before she hits the typical US math curriculum. One of my guiding documents has been "Lockhart's Lament" [1], but trying to figure out how to make that a reality for her has been difficult for someone like me with so little time! I am going to check this out tonight!

[1] http://en.wikipedia.org/wiki/A_Mathematician%27s_Lament

7
apalmer 2 hours ago 1 reply      
I definitely appreciate that math is much more than numbers and equations...

On the other hand, teaching a kid math has to, at some level and at some point, revolve around taking some numbers and getting other numbers out...

I can't really see how one can do algebra without knowing arithmetic. Algebra is fundamentally a different beast, but you just can't do it without some kind of arithmetic.

8
mathattack 7 hours ago 1 reply      
The article headline is a little sensationalist. It's very rudimentary, and it's more about learning limits than equations. That said, games like DragonBox highlight that kids are capable of being introduced to these concepts well before schools get around to it.
9
michaelfeathers 7 hours ago 2 replies      
I know that when I learned calculus the hardest part was reconciling the notion of infinitesimals with algebra. The delta/epsilon definition of a limit was unsettling. It's a shame. Much of calculus is easy to grasp intuitively - Riemann integration is easy to see and explain.
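The "easy to see" part of Riemann integration really is just adding up rectangles; a sketch of that intuition (my own, in JavaScript):

```javascript
// Riemann sum: approximate the area under f on [a, b] with n rectangles,
// sampling each one at its midpoint. As n grows, the sum approaches the
// exact integral - the limit that the delta/epsilon machinery formalizes.
function riemannSum(f, a, b, n) {
  const dx = (b - a) / n;
  let area = 0;
  for (let i = 0; i < n; i++) {
    area += f(a + (i + 0.5) * dx) * dx; // rectangle height times width
  }
  return area;
}

// Area under x^2 from 0 to 1 is exactly 1/3; the sum closes in on it.
console.log(riemannSum((x) => x * x, 0, 1, 1000)); // very close to 0.3333...
```

The picture of ever-thinner rectangles carries the intuition long before the formal limit definition needs to appear.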
10
ef4 6 hours ago 0 replies      
If you're interested in this kind of work, and especially if you're also interested in using computers to open up new kinds of mathematical play and learning, you should read Seymour Papert's "Mindstorms" book.

The ideas are powerful and radical. Radical enough to explain why run-of-the-mill schools never successfully integrated awesome tools like LOGO and its descendants. They can't deal with learning that's so child-driven and free.

12
cabinpark 7 hours ago 1 reply      
A big secret in mathematics is that many ideas in mathematics are rather simple and straightforward.

Formulating and proving these ideas in a rigorous and logical manner, however, is the difficult part.

13
sudhirj 7 hours ago 0 replies      
http://siterecruit.comscore.com/sr/atlantic/broker.js is holding up the entire site for me. Async, anyone?
14
beat 5 hours ago 1 reply      
I've thought for a long time that an abstract approach might work better than a concrete approach for introducing children to math.

I remember when my kids were in kindergarten, and their math education started with estimation. My first thought was, "Great. Now they've turned math into a touchy-feely all-answers-are-right-answers nonsense subject!"

15
Myrmornis 5 hours ago 0 replies      
"Revolutionizing the way math is taught" and not "memorizing multiplication tables as individual facts rather than patterns" sound like very worthy and important goals. At the same time, it's hard to judge this particular initiative based on the article; it might be "Hippie Math for Rich Kids Who Will Study Humanities Subjects and Become Trustafarians, by a Berkeley Yoga Instructor".
16
NAFV_P 4 hours ago 0 replies      
> But they also need to see meaningful (to them) people doing meaningful things with math and enjoying the experience.

I was wondering if this was essential for the learning process of five or six year olds.

Making maths enjoyable for young children would be the primary concern, but showing something meaningful (not necessarily to them) being done with maths by someone meaningful (to them)? It would work better if:

The kids need to see something that is relevant to their interests or needs being solved with maths by any old codger, as long as said codger can adequately manipulate algebra.

By the way, who is going to teach these children algebra? A fair number of teachers won't be up to the task. Reminds me of the learn to code buzz-phrase being passed around - first off you need to spend money and time on teaching teachers how to code.

EDIT: bloomin' asterisks.

17
fdej 7 hours ago 0 replies      
Obligatory Star Trek reference: http://www.youtube.com/watch?v=ETt8GJRbqLc
18
chrisBob 6 hours ago 0 replies      
Am I the only one who enjoyed math more earlier on than now? I loved calculus and geometry in high school and undergrad, but now in graduate school I can't follow the physicists at all, and my eyes just glaze over.
19
vasundhar 3 hours ago 0 replies      
There was a study done somewhere that seems quite unexpected to me now, and that was interesting back when I was in school myself.

My math teacher told me there were some schools where a pilot was conducted teaching calculus concepts to 5th and 6th standard students, who were then monitored through their graduations and career paths.

Interestingly, the students, who were chosen randomly, performed equally well, and their decisions, not just in math, were more logical.

This study was done in India :)

26
Mopidy - server which can play music from multiple sources mopidy.com
4 points by dz0ny  14 minutes ago   discuss
27
TLS Triple Handshakes imperialviolet.org
58 points by zdw  6 hours ago   11 comments top 4
1
tptacek 5 hours ago 3 replies      
This is extremely cool. I am going to butcher this.

At its root, the attack weaponizes unknown key share attacks by deploying them in the context of a TLS MITM. UKS attacks are a well-known but infrequently considered protocol vulnerability; the best documentation I know of them (thanks to smarter friends who pointed this out to me) is the STS paper from '99:

http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=33A...

The basic idea is that protocols assume that if Alice and Bob agree on a key, Alice and Bob must both believe they've authenticated to each other. In reality, Mallory can often exploit that assumption. For instance, Mallory can negotiate a session with Alice and Bob, arrange to have the sessions arrive at the same key, and then replay Alice's messages on the connection to Bob.

The kernel of the attack in this paper is a UKS attack that exploits the differing assumptions of two different session authentication schemes that coexist in TLS. The first is the ordinary public-key exchange that people assume happens on every TLS connection. The second is the session resumption form, which skips the public-key exchange. The problem is that the first form is cryptographically bound to the certificates on the connection, but the second form is bound instead to the shared state that is computed by both sides of the connection and not explicitly to the certificates. An attacker can MITM Alice and Bob, synchronize their keys (by waiting for Alice to compute a key and send it to the MITM, unwrapping it, and resending it to Bob), kill the session, then MITM the session resumption that happens when Alice reconnects; the attacker has laundered their involvement in the session.

Aside: to me, the funnest part of the paper is the synchronization/UKS attack that happens in classic DH TLS (which is rarely used). Instead of unwrapping and resending Alice's key, Mallory can trick Alice into accepting a bogus DH group which reveals her key to Mallory. The researchers found that Windows' SChannel and Chrome/Firefox NSS were susceptible to malicious DH parameters, as were 12% of the top Alexa servers. Wee!

Why "rarely used"? Because most DHE TLS connections are Elliptic Curve DH, not classic DH. Elliptic curve groups are much scarier to validate than classic DH groups (where you really just care that the DH modulus is large and prime and the generator generates the whole group --- though ironically the authors found that this doesn't reliably get validated). As a result, TLS stacks require specific ECDH groups; you can't play as many games with them.

The downside (to me, not to you) is that the attack has limited impact. Basically, what this UKS attack mostly does is revive a renegotiation attack Marsh Ray found a couple of years ago. Alice connects to Bob; Bob demands a client certificate from Alice. Mallory wants to connect to Bob, and so makes a connection to Bob, sends data, and then renegotiates TLS with a handshake replayed from Alice and Bob. Mallory can't read those messages or, for that matter, anything subsequent on the connection, but Mallory's data may be accepted by Bob despite not being authenticated.

'cperciva, 'pbsd --- I've given you a lot to work with here. Tell everyone else here how I got this wrong! :)

2
zobzu 5 hours ago 2 replies      
I'll just throw this out there:

Firefox is AFAIK not vulnerable to this.

3
jessaustin 5 hours ago 0 replies      
Hopefully the image below is broken. If not, then it likely will be soon because of a browser update.

Yay it's broken! I can't help but be impressed by this.

4
moron4hire 5 hours ago 0 replies      
I'm sorry, this is completely OT, but I can't help it:

Is that anything like McDonald's Triple Thick Milkshakes?

EDIT: I REGRET NOTHING!!!

28
WWII Bugatti 100P Plane Rebuilt ibtimes.co.uk
25 points by x43b  3 hours ago   13 comments top 4
1
beat 2 hours ago 3 replies      
It's a lovely project, but I wouldn't be so sure it would "win the war". Raw speed and aesthetics sometimes take a back seat to other considerations. For example, what was its range? Its ammo load? Its durability and aerodynamics when damaged? Its reliability?

If the plane doesn't have the range to get to the battle, fight the battle, and get back home again, it will fail. If it dies the moment it is grazed by some flack, it will fail. If it runs out of ammo while dogfighting, it will fail. If it's too unreliable to stay in the air without excessive maintenance, it will fail. If it costs twice as much to manufacture as a slightly less effective plane, it will fail. Et cetera.

In WWII, there was something of an arms race for the largest, most heavily armored and heavily armed tanks. Germany developed the Maus, Great Britain the Tortoise, and the US the T28. None ever saw battle. They were too expensive to build and too difficult to transport to the battlefield.

I suspect the Bugatti 100P fighter might have been another case of the same problem, had it ever made it to production: overoptimizing one sexy feature at the expense of other, necessary ones.

2
cromulent 3 hours ago 0 replies      
"Jet Fighter that Could Have Won Battle of Britain for the Nazis"

Jet?

3
goodcanadian 2 hours ago 1 reply      
Already pointed out, but not a jet. Also, the requirements for a racing aircraft and a fighter are different enough that it is not at all clear to me that there is any relevance to military applications in this case.
4
lotsofmangos 1 hour ago 2 replies      
Lovely plane, shame about the article.

The Germans already had the fastest plane, the rocket powered Messerschmitt Me 163. It was so fast that it was completely impractical in dogfights against slower planes that had tighter turning circles. Also, it kept blowing up on landing. And take off. And any time anyone looked at it funny.

29
Show HN: Awesometalk Free video calling without the hassle awesometalk.com
32 points by brendanib  5 hours ago   15 comments top 5
1
mholt 1 hour ago 0 replies      
Awesome! I just tried it from a university campus to a business ISP (both very high bandwidth) and the quality was crystal-clear.

Two immediate requests:

1) Screen sharing

2) Group chats

Definitely loving that there's no other software or installations required. Looking forward to its further development.

(Edit: Already got an email from the developers, and I understand the security limitations of easy screen sharing, so I guess just do what's possible; I'm not asking for the impossible. Anything easier than installing some full-blown software would be great.)

2
waldir 30 minutes ago 0 replies      
This only seems to work for two people at the moment (i.e. no group calls), but for that use case, it seems promising.

As it happens, for the past few days I've been trying out a lot of tools for online video/audio conferencing, and made a summary of their features here: https://docs.google.com/spreadsheets/d/1C1gAWPBmAWsQEo78ysds... (I didn't include Awesometalk because I'm only looking for group meeting tools, not one-to-one video chat)

3
brendanib 4 hours ago 1 reply      
Hey, I'm one of the co-founders of Awesometalk. We started working on this 2 weeks ago and I'd be happy to answer any questions you have.
4
zachlatta 4 hours ago 1 reply      
How is this different than https://appear.in/?
5
superduper33 2 hours ago 0 replies      
Screenshare pls
30
Kickstarter passes $1B in pledges kickstarter.com
177 points by mecredis  13 hours ago   60 comments top 18
1
basicallydan 10 hours ago 2 replies      
Congratulations, Kickstarter.

In my opinion, Kickstarter is one of the most important tech startups of the last few years for democratising the process of project funding so significantly. It's started to change the way people do business and create things in such a fundamental way.

One of my favourite examples to give is the way in which games can now be funded. The truth of the matter is that games can cost a lot more than they used to, but for the last 15 years publishers have had such a strong influence over which games do or do not make it to market, and they know that a mass-market game is going to be less risky than something like Double Fine's Broken Age. So they probably won't fund it, and I can kind of see why.

Now thanks to Kickstarter et al., a developer just has to be able to say to potential fans, "is this a game you want?!" and if they say yes, they essentially pay in advance for their game. It's so direct, it's so wonderful.

Yes, there's still an element of having to sell the idea, but at least the idea is being sold to a bunch of regular folks, who have a little bit of money as opposed to somebody in charge of a large business. The decisions are much easier to make, and the risk is much lower for the investors.

I hope crowdfunding in this way isn't just a flash in the pan. I hope we continue to see projects funded in this way for a long, long time to come. Here's to another billion, Kickstarter!

2
jonknee 9 hours ago 4 replies      
That's pretty impressive. Kickstarter takes a 5% fee (and Amazon its own processing fee), which means ~$50M in lifetime revenue for Kickstarter, roughly $25M of it over the last year. I wonder how long they'll stick with Amazon as their payment processor; it seems like they're leaving a lot of money on the table.
3
BillyMaize 2 hours ago 0 replies      
I just found out that my uncle, whom I thought very successful, failed to fulfill his responsibilities to his backers. He was working on a 3D printer that I was hoping to buy some day, but he never delivered more than a few of them (most of which didn't even work), and now his backers are trying to band together to sue him. A few days ago he filed for bankruptcy, so they probably won't get anything.

As much as I love the idea of Kickstarter, I have now seen first-hand how you can be cheated, and you just can't trust the system to work (although many have, and it has worked for them). After the disappointment of Minecraft when Notch started making a ton of money, I have simply stayed away from all crowdfunding/buy-and-play-during-alpha projects for good. There is nothing wrong with simply waiting until something is finished to buy it.

4
toong 11 hours ago 1 reply      
I skimmed over the map to find some interesting data:

* Averages are close to $200 per backer

* The USA has a $175 avg per backer, but makes up 2/3 of that $1B

* The Middle East (UAE, Saudi Arabia, Kuwait, ...) spends $400 to almost $800 per backer

* Antarctica has 11 backers @ $337/backer :-) putting it in 4th place after the Middle East; after that come Norway ($280) and Belgium ($250)

Would be interesting to plot this data against the number of citizens, so you can get a view of each country's participation rate. (The US would probably top that at around 1%)

Edit for readability

5
famousactress 8 hours ago 0 replies      
Tangential... Kickstarter mentioned influential contributors but for whatever reason didn't give links to their backed projects, and I was particularly curious about what projects Neil Gaiman had backed. They weren't kidding, he's quite the prolific backer! https://www.kickstarter.com/profile/108204027
6
samwillis 12 hours ago 3 replies      
I would love to know what the successful payout total is.
7
Grue3 10 hours ago 2 replies      
Wonder how much of this $1 billion actually resulted in backer rewards and wasn't indefinitely borrowed or outright stolen. Just recently the webcomic artist John Campbell, who raised $50,000, wrote a long screed telling his backers that they will never get the book they paid for. And a lot of projects get completely forgotten about after they get fully funded.
8
crypt1d 10 hours ago 2 replies      
These stats can actually be quite useful if you ever decide to create a project on Kickstarter. For example, most of the money is pledged on Wednesdays and between the 10th and 15th of the month, so you can plan your funding cycle accordingly and increase marketing efforts during that period.
9
rplnt 11 hours ago 0 replies      
The share buttons at the bottom don't work.
10
greyshi 12 hours ago 2 replies      
That's a pretty website. I was impressed by the animations. What is the term for these kinds of slideshow-like sites?
11
yohann305 8 hours ago 0 replies      
It is very interesting to observe that the countries that spend the most on Kickstarter are the same ones that spend the most in the Apple App Store.
12
quarterto 11 hours ago 1 reply      
"A million dollars isn't cool. You know what's cool?. A billion dollars."
13
mpg33 5 hours ago 3 replies      
When will we see crowd-equity sites? I.e., invest in early startups and get equity in the company.
14
salmiak 6 hours ago 0 replies      
Love the price comparison: what you can get for $1,000,000,000.

Also amazed that more money was pledged from Antarctica than from some African countries.

15
mastersk3 11 hours ago 2 replies      
The country-wise breakdown is fantastic; it roughly indicates the startup culture present in each country
16
vrikis 12 hours ago 1 reply      
That's actually really, reallyyyyy impressive... Congrats to Kickstarter.
17
jokoon 8 hours ago 0 replies      
I live in France, and Kickstarter doesn't seem to be available here :(
18
Thiz 8 hours ago 2 replies      
Off topic, but looking at the map, it validates the old libertarian axiom: the more liberties, the more prosperous the economy, and the more donations for utilitarian causes.

Who will feed the poor? Liberty will.

       cached 3 March 2014 23:02:01 GMT