hacker news with inline top comments - 29 Sep 2012 - Best
1
Bret Victor: Learnable Programming worrydream.com
981 points by siavosh  2 days ago   179 comments top 2
1
bretthopper 2 days ago  replies      
There are already two comments here about the article being "harsh" or "ungracious" towards Khan Academy, which is ridiculous.

The usual HN article that contains criticism is limited to just that: some rant that took 10 minutes to write and contains nothing constructive.

Bret Victor put an insane amount of time into this (I can only assume) and is truly advancing mindsets about programming tools and learning. We should all be thankful that he put this much effort into a "criticism" piece.

2
scott_s 2 days ago  replies      
Programmers, by contrast, have traditionally worked in their heads, first imagining the details of a program, then laboriously coding them.

I don't think this describes most real work done by programmers. Rather, what he says we should do,

To enable the programmer to achieve increasingly complex feats of creativity, the environment must get the programmer out of her head, by providing an external imagination where the programmer can always be reacting to a work-in-progress.

Is exactly what most programmers already do. We just don't usually have a nice, interactive environment for it; instead it's a cycle of coding, inspecting results, thinking some more about new issues, then coding and inspecting again, on up until the problem is solved.

In other words, I think that programmers do tend to solve problems by "pushing paint around." I rarely start with a full appreciation of the problem. But in order to gain that understanding, I have to start trying to solve it, which means starting to write some code, and usually looking at results. As I go through this process, the domain of the problem comes into focus, and I understand better how to solve it.

We already do what Bret is talking about, just not at the granularity he is talking about. For beginners, I can understand why this difference is important. But I certainly solve problems by pushing paint around.

In general, I think this is a fantastic piece for teaching programming, but I don't think (so far) that all of it carries over to experienced programmers. The example of an autocomplete interface that immediately shows icons of the pictures one can draw is great for people just learning. But that's too fine-grained for experienced programmers. Chess masters don't need to be shown the legal moves on a board for a bishop; their understanding of the game is so innate at that point that they no longer play in such elementary terms. Similarly, experienced programmers develop an intuition for what is possible in their programming environment, and will solve problems at a higher level than "I need to draw something." That is the reason we put up with existing programming environments.

2
Meeting A Troll traynorseye.com
665 points by Liu  4 days ago   200 comments top 2
1
nhashem 4 days ago 2 replies      
For what it's worth, if you're the target of especially terrifying harassment like this, it's important to keep two things in mind: the more targeted the harassment is, the easier it is to identify the harasser, and the more likely it is that there's some sort of personal connection.

If you're receiving stuff in the mail (like the lunchbox with ashes) or stuff literally just dropped on your doorstep (the dead flowers), it's extremely unlikely that someone on the internet you have absolutely zero connection with just randomly picked you to ruin your life that badly just for some lulz.

I'm glad the OP was able to take the steps he needed to identify his harasser, which I'm sure was a lot more empowering than living in fear and checking his door locks every night.

2
yelsgib 4 days ago  replies      
For those of you who claim that the right thing in this situation would be to turn the child over to the authorities - what exactly do you think the authorities are going to do to make this situation better? The child obviously has some darkness that he needs to work through - to work through darkness he needs the support of a loving community. I find it very sad and disturbing that some commenters (who I assume are adults) believe that the right thing to do is to hand a child like this over to the police or mental institutions. This idea that the police are some sort of magical wand that you can wave at problems to make them go away is at the center of our (our here meaning the US & Britain's) social decay (c.f. the current treatment of drug abusers, ethnic minorities, and the "mentally ill" in the United States).

Assuming this story is true, what OP did was the right, human, adult thing to do - to treat the child as a human being capable of change and growth and to see to it that the community accepted him and moved him towards change. Concepts like "justice" and "psychosis" are easy to throw around and are very practical, but their use is typically the root of more harm than good.

3
A Letter from Tim Cook on Maps apple.com
522 points by j4mie  15 hours ago   447 comments top
1
robomartin 12 hours ago  replies      
What to say? The fact that such a public letter had to be issued means that there's a lot of push-back. Apple just doesn't do that. In fact, I don't remember any software company doing this. I could be wrong. This feels unprecedented.

Not one person posting on HN and the many blogs really knows what happened behind the scenes. Apple engineers are not known for being dumb. Someone had to know that Maps was a bad idea. A huge step backwards. They had to know.

So the question might very well be: Why did they do it?

This couldn't have been out of spite. Just to kick Google off the platform. One just doesn't do that. Maintaining a complex code-base such as iOS is difficult enough. Adding to that the friction of delivering a substandard product is not something one does without very good reasons.

Conjecture is all we have from the outside. My humble guess is that it had to come down to a business deal they did not want to make. The details of the deal are not important. Who was right and who was wrong isn't important. What is important is that whatever they had in front of them convinced Apple management that they had no choice but to, effectively, downgrade the next release with Maps.

I already know of a lot of non-tech people, particularly outside the US, who are livid about Maps. After dutifully upgrading their devices to iOS 6 they discover that Maps are, in their words, "crap", "useless", "unreliable", "a joke", "not accurate", "una mierda" (shit), etc. The reason for the strong feelings is that, let's face it, if a good tool such as Google Maps is available to you, you might tend to use it.

And a lot of people would use it all the time. My own wife relies on Google Maps all the time. Thankfully she was wise enough to marry a geek who promptly told her not to upgrade her iPhone 4S to iOS 6 and not to swap it out for an iPhone 5. In fact, not one person in my family will do either of those things. And that is the case --that has to be the case-- for millions of people at this point.

This is the data we are not getting, and that Apple will probably never release. I own eight iOS devices. Not one of them will be upgraded to iOS 6. In fact, the upgrades stop here until Maps starts to get really good marks. And, of course, we probably would have purchased at least three iPhone 5's. Not happening. I'll get one for development but it will not be activated.

How many millions are in this boat? If someone is a heavy Google Maps user it makes no sense to get an iPhone 5. What's wrong with a 4S? Nothing. Use their website you say? Not the same, most would say.

As a developer, there's a lesson that needs reinforcing every so often. What better way to reinforce it than to see a tech giant make some of the mistakes lesser companies make: if you can at all help it, don't base your product on someone else's technology. Don't make someone else's technology such an important part of your offering that not having it will hurt you. Of course, sometimes you have no choice.

As a user and a developer I view iOS 6 as a significant, if not huge, step backwards. Between Maps and the eviscerated app store one has to ask that cliche-ish question: What were they thinking?

Wouldn't we like to know.

4
Tesla reveals Supercharger network, Free Fill Ups engadget.com
451 points by dave1619  4 days ago   269 comments top 4
1
kamaal 3 days ago  replies      
The arguments in this thread are amusing, to say the least. 30-minute charging times, high prices of Tesla cars: yes, all that is true. But this was never about 'electric cars are awesome now'.

I'm pretty sure cars in pre-Henry Ford times were not great in terms of affordability, total cost of ownership, or availability of fuel around the country. On any given day you could trust your horse to get you home more than a car. The same goes for IBM computers in their early days: your mobile phone likely has more computing power than all the computers IBM sold a few decades back, and those machines were painful to use and maintain. Needless to say, all these things had huge maintenance issues.

But these things caught on. So have automation, productivity tools, and so many other new things that people initially resist but later accept as a fait accompli and learn to move forward with.

Electric cars, self-driving cars, wearable computing (like Google's glasses): these really might look infeasible at this time. But please, these are just ideas waiting for their time to come; with a little push they will eventually catch on.

2
sixQuarks 3 days ago 5 replies      
I would say Elon Musk is the greatest entrepreneur of the past century - even more so than Steve Jobs. He simultaneously created three separate companies, all in extremely complex industries, and combined all of them into one overall strategy.

The precision manufacturing they learned with SpaceX is incorporated into the Tesla S. Their aim is to make the Model S the most reliable and problem-free vehicle due to this precision.

Now they are incorporating SolarCity technology into the entire system. This is absolutely brilliant!

3
revelation 4 days ago 1 reply      
Ugh, tech reporting. Is there an original source for this somewhere? There's some limited info at http://www.teslamotors.com/supercharger.

100 kilowatts good for three hours of driving

From the announcement video, it seems they are charging at 370 V with a maximum of 225 A, which is 83.25 kW. So that 30-minute figure for half a charge is pretty real. The cost for one such charge would be approximately $3 at industrial prices for electricity. Given that you can feed solar power back into the grid at rates above or equal to what normal customers pay for their electricity, I could see them breaking even on this when discounting the initial investment (which will pay back a hundredfold in adoption rate for their cars).
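
For what it's worth, the arithmetic holds up. Here is a quick sanity check; the ~$0.07/kWh industrial rate is my assumption, backed out from the ~$3 figure, not something stated in the announcement:

```python
# Charging power implied by the announcement video's figures.
volts = 370.0
amps = 225.0
power_kw = volts * amps / 1000.0   # 83.25 kW

# Energy delivered in a 30-minute session (roughly half a charge).
energy_kwh = power_kw * 0.5        # ~41.6 kWh

# Cost at an assumed industrial electricity rate (assumption,
# not stated in the announcement).
rate_per_kwh = 0.07                # $/kWh
cost = energy_kwh * rate_per_kwh
print(f"{power_kw:.2f} kW, {energy_kwh:.3f} kWh, ${cost:.2f}")
```

That lands within a few cents of the $3 estimate.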

4
jerrya 4 days ago  replies      
Is it outrageous to think that eventually this "gas" will be free because outlets will compete to have Tesla and other electric customers stuck at their restaurant, bar, store for a half hour?
5
Announcing the First Beta Release of Persona mozilla.com
445 points by callahad  1 day ago   182 comments top 6
1
Osiris 1 day ago 1 reply      
I just tried it on the Times Crossword page. The workflow is really simple and elegant. I put in my email address. It took a second to determine there was no Persona account, then asked me to create a password. After that, I clicked on an authorization link in my email account, and as soon as I did that it immediately logged me in. I clicked Log out and back in again, and it immediately recognized me and logged me in.

This is really what I was hoping to see from OpenID when it came out, but there the process to set up an account and get started was much more cumbersome.

I look forward to seeing native support for Persona in browsers.

2
binarymax 1 day ago 1 reply      
I've been messing around with BrowserID since it first went public last year (with a node.js backend) and I love it. It's so much more straightforward to implement than OAuth and OpenID, and the fact that it's tied to your email address is perfect. I'm definitely going to be using it as my primary auth system going forward. Great job Mozilla!
3
Too 1 day ago 5 replies      
After reading the text twice and watching both videos, I still have no clue what it actually does and how it solves the problem. I'm a few pages of skimming into the documentation now, but there's no overview of what it actually does in the background.

Just a load of buzzwords and awesomeness!1 of how this will revolutionize my account management and how easy the API is.

Is it a password manager, a biometric system or some kind of account provider?

4
JakeSc 1 day ago 4 replies      
My major concern with this, besides the eggs-in-one-basket issue, is that it places even more value on my email account.

Years ago, my email account was simply used for exchanging short pieces of text with acquaintances and companies. Now it's the central key to all my authentication sessions and finances, and therefore presents a huge target for attackers.

I've been looking for ways to reduce the risk associated with losing access to my email account, should that ever happen. Yet for all its benefits, Persona still places yet more importance on protecting my single email password.

5
BryanB55 1 day ago 4 replies      
This seems to be a nice solution if you are on your own home/work computer and have your email open. They didn't really explain much about HOW it works, but the problem I'm seeing is that if I'm at a public computer and want to log in, I have to log in to my email account first and click on the Persona link. I guess the benefit here is that I only need to remember one password (my email password), but my email password is usually a 50-character random string that I don't like entering on public computers, if I even could.

So if I want to log in to a crossword puzzle I almost feel like I have to compromise my email password which is much more valuable, if say the public computer has a key logger or something.

Maybe I'm overthinking it. I could see how this would be useful if I have my desktop mail client running and can just click a link to log in, though.

6
forgotusername 1 day ago  replies      
I really want to believe in something like this, but you'd get much better traction by explaining a few key details:

* What the hell does the JS assertion object look like?

* How do I run an independent service?

* In a single page, walk me through the steps to integrate?

Videos, dodgy music, and overenthusiastic PFYs appeal to me much less than good documentation.

6
StarCraft: Orcs in space go down in flames codeofhonor.com
404 points by phenylene  1 day ago   84 comments top 7
1
T-R 1 day ago 1 reply      
For anyone who's wondering:

Mode 7 was the SNES graphics mode that allowed for things like rotating and zooming a background layer - used for things like the track in Mario Kart and F-Zero, or the worldmap in Final Fantasy 6 (while you're in the airship). The Scroll Register was used for scrolling in Mode 7.

H-Blank is the horizontal blanking period (and associated interrupt) - a time period between the drawing of scanlines on the screen (there's also V-Blank, between frames). Changes made during H-Blank could make for some interesting effects - it was used for things like the circle that closes around Mario at the end of a level in Super Mario World: the rectangle draw routine is used, but the size of the rectangle is changed between scanlines, creating a circle. I'd imagine this was used for some of the wavy distortion effects in games like Chrono Trigger and Earthbound as well.
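
To get a rough feel for the circle trick, here is a sketch in plain Python rather than real SNES register writes (the function name and screen dimensions are made up for illustration): each scanline's "rectangle" is a horizontal span whose width is recomputed, as if during H-blank, from the circle equation, so stacking the spans yields a filled circle.

```python
import math

def circle_spans(cx, cy, r, height):
    """For each scanline y, return the (left, right) span of the 'rectangle',
    mimicking rewriting the rectangle width during each H-blank period."""
    spans = []
    for y in range(height):
        dy = y - cy
        if abs(dy) <= r:
            # Chord of the circle at this scanline.
            half = math.sqrt(r * r - dy * dy)
            spans.append((cx - half, cx + half))
        else:
            # Outside the circle: the rectangle collapses to nothing.
            spans.append(None)
    return spans

# A circle of radius 10 centred on scanline 16 of a 32-line screen.
spans = circle_spans(cx=128, cy=16, r=10, height=32)
```

The widest span falls on the centre scanline and shrinks to a point at the top and bottom of the circle, which is exactly the per-scanline width schedule the hardware trick feeds to the rectangle routine.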

2
SoftwareMaven 1 day ago 6 replies      
I've worked at two game studios over the years and came away with one takeaway: there won't be a third unless it's my own. There are few more dysfunctional engineering environments on the planet, leading to continual burnout.

There is a long line of coders who grew up playing games and want to keep doing that into adulthood, leading to a perverse supply/demand ratio that allows studios to treat their employees like crap under the auspices of "that's how the industry works". I wouldn't buy into it.

3
wtallis 1 day ago 1 reply      
"As bad as Ion Storm was internally, there was a dark secret that eventually unraveled. It wasn't until years later, well after the 1996 E3 demo of Dominion Storm, and after StarCraft launched, that we discovered that the Dominion Storm demo was a fake."

How many times in the history of computing has a team seen a faked demo, believed it, and cloned it, unwittingly becoming the first ones to do it for real? The fact that there are several such stories is really quite amazing.

4
danso 1 day ago 0 replies      
Ah, the press event, the time-suck that had to be tolerated in every realm of human undertaking: video games, sports, business, politics, etc.

> As every game developer knows, release dates are slippery, but the dates of trade shows are set in stone. If a game studio has spent hundreds of thousands of dollars to prepare booth space, purchase long-lead print advertising and arrange press appointments, the development team is going to have to demo something or heads will roll.

It's crazy to think about how much money and how many resources were wasted, not to mention the destructive pressure created, by these contrived schedules of publicity dates. Getting publicity today is not as simple as making a webpage and a Twitter account, but at least it's not how it was in the OP's day.

5
STRML 1 day ago 2 replies      
Blizzard is often revered as one of those few studios, like Valve, that operate on "when it's done" time - game releases happen when they're ready, not when some publishing house requires it. As a result they have been monumentally successful.

It is interesting to hear that this was not always the case. As graphics have gotten better, storage has gotten cheaper, and budgets have gone way up, studios can't just pump-and-dump franchise cash-ins and casuals quite like they used to (with the exception of smartphone titles). ION Storm did Blizzard a great favor by wounding their pride and motivating them to create one of the greatest games ever - and to continue that brilliance until the present day.

6
xentronium 1 day ago 3 replies      
Speaking of changes of plans: does anybody remember the pre-release version of Blizzard's Warcraft 3? See the screenshots [1][2].

That was entirely different gameplay, with more RPG and less strategy. I remember reading about it in some magazine and being greatly excited. When it came out, I liked it even more than I expected; that was one kick-ass game.

And then I found the World Editor. Needless to say, I was stunned. I spent all my free time playing with JASS (the WC3 scripting language), and that was very probably a deciding factor in me becoming a programmer. Hell, even now, I think I could make a decent map if paired with a good landscape designer/storymaker. I haven't finished many maps and projects in my time, but the process of creation/programming was so incredibly enjoyable that the end result didn't even matter.

I wonder if there are any other World Editor users on HN.

[1] http://www.scrollsoflore.com/gallery/displayimage.php?album=...

[2] http://www.scrollsoflore.com/gallery/albums/war3_prerelease/... also click arrows on the page, there is more.

7
comlag 1 day ago  replies      
The link to the story of ION Storm and the game Daikatana he mentions was a really interesting read as well. Talk about a complete mess of a company.
http://www.dallasobserver.com/1999-01-14/news/stormy-weather...
7
Subtle Patterns subtlepatterns.com
398 points by benologist  2 days ago   49 comments top 32
1
krogsgard 2 days ago 1 reply      
Subtle Patterns has been around for a pretty long while now. It's a great site. My designer colleagues use it quite a bit.

The only thing I wish it did was show the pattern size by default in the description. Some of them show it, but not all. And in my experience they vary a good bit.

Also, on the GitHub page there is a .pat zip of all the patterns, which makes it even better: https://github.com/subtlepatterns/SubtlePatterns

2
aw3c2 2 days ago 1 reply      
What license are the images under? There is a license mention at the bottom of the website, but to me that seems to cover the website only, not the downloaded images. It would be great if each zip included a proper license.
3
JonnieCache 2 days ago 1 reply      
For something less subtle see http://www.patterncooler.com/

Don't miss their ultraswanky svg editor thingy! http://www.patterncooler.com/editor/

4
mrchess 2 days ago 2 replies      
Nice to see they got their domain back.
5
yenoham 2 days ago 1 reply      
I had thought he'd given up on the site the other day, and just started using the git repo instead. Glad it was just a blip instead.

Hopefully he'll blog about what happened.

6
bobbles 2 days ago 1 reply      
Is this your site?

The first texture looks like it's broken because there is no border. I thought it wasn't loading at all, but it was just the exact same as the background on the site.

Edit: never mind... I must have clicked directly on the preview button when I changed focus or something.

7
mcormier 2 days ago 0 replies      
This is the kind of website that makes me want to go redesign a website for no reason other than to fiddle with these patterns.
8
olalonde 2 days ago 0 replies      
9
ipince 2 days ago 0 replies      
Wow... they're so subtle I completely missed them.

I thought the page was broken for mobile. And then it took me a second pass to realize it was in fact not broken on desktop too!

10
vindicated 2 days ago 0 replies      
It's a real gem, especially for someone who isn't as aesthetically astute, like me. I used it on a couple of websites I've been working on recently. [1][2]

[1] http://tweetfad.com/

[2] http://db.uwaterloo.ca/dmc2013/

11
barrkel 2 days ago 0 replies      
I'm strongly reminded of the old Windows 3.1 tiled patterns.
12
Gravityloss 2 days ago 0 replies      
Some of these are quite strong and contrasty, and very repetitive at a small scale, and thus slightly migraine-inducing.

With the horizontal repetition they also mess with the angle of your eyes / distance perception (as there are multiple transpositions where they correlate perfectly).

I like subtle, very random and smooth patterns a lot.

13
wingerlang 2 days ago 0 replies      
I found a chrome plugin that tries the patterns on any website quickly. Pretty handy while choosing something I guess.

https://chrome.google.com/webstore/detail/cnhhinfdmnakglphga...

14
ojilles 2 days ago 0 replies      
Related, if you wanted to make your own patterns (and then perhaps upload to subtlepatterns.com :-)) check out this iphone app: http://patternshooter.com/
15
gere 2 days ago 0 replies      
Sorry to go a little off topic, but I'm wondering if someone remembers a background pattern/texture generator posted here on HN a few months ago. I lost the bookmark. I only remember that it had a dark background. Thanks.
16
lurker14 1 day ago 0 replies      
These all make me think my monitor is dirty. Why is this a feature?

Also, why are they called "patterns" and not "textures"?

17
madrona 2 days ago 0 replies      
I think I need a monitor with better color fidelity. Half of these look like whiteness.
18
debacle 2 days ago 0 replies      
I really like all of these. They just feel good.
19
nachteilig 2 days ago 0 replies      
It's really awesome that they thought to include retina versions. I really hope this becomes the norm, esp. with bootstrap etc.
20
zem 2 days ago 0 replies      
I don't know if it's my monitor or colour settings or what, but some of them don't seem to do anything when I click the preview button (e.g. "straws" and "swirl" from the current front page).
21
Lerc 2 days ago 0 replies      
Great, now I have to clean my monitor.
22
salimmadjd 2 days ago 0 replies      
I've been using it for a while. Downloading their PSD patterns makes your design life a lot easier.
23
srik 2 days ago 0 replies      
They also include a "@2x" version for retina displays. Pretty nifty.
24
jpadilla_ 2 days ago 0 replies      
It'd be cool to read about what they had to do after it was hijacked.
25
JacobIrwin 1 day ago 0 replies      
Thanks for posting this excellent free resource - bookmarked for definite future use :)
26
jmharvey 2 days ago 0 replies      
"Back Pattern" flickers horribly when I scroll. Which makes me wonder whether the other patterns will have a similar flicker on other displays.
27
dutchbrit 2 days ago 0 replies      
Awesome to see it back again!
28
thejosh 2 days ago 0 replies      
dot_clean would come in handy before uploading files from a mac.
29
mylittlepony 2 days ago 0 replies      
This is fantastic, thanks!
30
abrichr 2 days ago 0 replies      
Beautiful, thanks.
31
iamjason89 2 days ago 0 replies      
these are great. thanks
32
iamjason89 2 days ago 0 replies      
these are a great set. thanks
8
Jetstrap for Bootstrap jetstrap.com
396 points by yesimahuman  3 days ago   147 comments top 18
1
mcobrien 3 days ago 2 replies      
I tried this out earlier, just before you sent the beta email out. A few thoughts:

1. Keyboard shortcuts! I really wanted to deselect items by hitting escape but it didn't work. I kept trying anyway because I'm so used to Balsamiq :) I'd also love to be able to hit / to quick-search the list of elements I can add, then return to add the selected item. Again, Balsamiq really nails this.

2. I wanted to add a div with Bootstrap's pull-right class, but couldn't figure out how. Some means of adding custom elements with custom CSS classes, even in a limited way, would really help me.

3. I typed some text (directly into a grid .row) but it didn't show up in the CSS+HTML tab. Not sure if this was a bug or not.

I really want this to work and it would save our team loads of time (meaning, we would pay for it :). Looking forward to what you do next.

2
DanielBMarkham 3 days ago 1 reply      
Sweet spot between heavy IDE and notepad. I like it!

I plan on using this on my next site. One question, though: is there a way to know which version of bootstrap and such the editor is using?

3
itsprofitbaron 3 days ago 2 replies      
I noticed the "Try Jetstrap free today!", then I saw you mention "Our first thought was $10/mo for 100 screens"[1], which I think is not only underpriced but also the wrong pricing structure.

You should implement something like this:

- User signs up - gets 5 - 10 FREE Screens.

- Once they've used up their FREE screens take them to a page to purchase more screens.

- When they have 1-2 FREE Screens remaining you should have a popup which says they've nearly used their free screens up and to buy more (with one option going to the payment page & the other letting the user continue what they were doing).

- On the page where the users can purchase more screens, I wouldn't bother with a monthly subscription plan because that's not what your target audience is interested in.

I'd offer the following options:

100 Screens - $39 ($0.39/Screen)

250 Screens - $95 ($0.38/Screen)

500 Screens - $180 ($0.36/Screen)

[1] http://news.ycombinator.com/item?id=4572125
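
For the record, the per-screen figures in those tiers check out:

```python
# Proposed one-off screen packs: (number of screens, price in dollars).
tiers = [(100, 39), (250, 95), (500, 180)]

# Effective per-screen price for each pack.
per_screen = {screens: round(price / screens, 2) for screens, price in tiers}
print(per_screen)  # bigger packs carry a small volume discount
```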

4
tomasien 3 days ago 2 replies      
Mwahahaha my plan to become a professional web designer without learning how to write any code is working!
5
neya 2 days ago 2 replies      
My suggestion as a UI/UX guy: refine the call-to-action buttons by assigning them a different color. Black is not really an attention seeker. I would suggest something like a bright green or pale blue for the buttons (the sign-up buttons at the top and the bottom). Hopefully this will improve your conversion rate.

Just these, otherwise excellent. Keep it up guys!

6
aaronblohowiak 3 days ago 1 reply      
This looks like one of the most sensible WYSIWYG web page builders. Thank you for the minimal permissions to sign up with Twitter. Might I recommend that you add a preview of the element when you hover over the component in the tray? Nothing happens when I drag and drop things either -- latest Chrome release channel.
7
codegeek 3 days ago 1 reply      
I can only sign up using Twitter, Google or GitHub? I have these accounts but prefer not to use them. Any plans to offer a generic login?
8
justindocanto 3 days ago 1 reply      
I love this tool. As a programmer I cringe at most WYSIWYG editors, because they heavily dilute the code needed to make something happen and hardly ever get it done in the first place. This actually looks very promising.

One note though. Your target audience is going to include non-programmers, and those people might not even know the proper names for some of these Bootstrap elements. So I suggest some kind of toggle between the Bootstrap name buttons and some sort of GUI, like actually dragging a menu into the workspace, or dragging a Twitter Bootstrap button into the workspace, etc.

Other than that.. I like where this is going.

9
erikano 3 days ago 1 reply      
Sorry for being blunt, but the site looks quite bad on a mobile device [1][2][3].

I realize that this tool is probably meant to be used from a computer with a large screen and with a mouse, but consider the following scenario: Somebody, who is using their smart phone, is landing on your page from a search for "bootstrap responsive interface-building tool". If that was me, I'd probably leave the site pretty fast.

[1]: http://i.imgur.com/UsYhZ.png?1

[2]: http://i.imgur.com/UzaB8.png?1

[3]: http://i.imgur.com/JEPSj.png?1

(Using Dolphin Browser on Samsung Galaxy S2, Android 2.3.5)

10
jeffpersonified 2 days ago 0 replies      
I think this is a great tool, and desperately wanted it to work. In fact, I decided to try it out on a project I'm currently working on. That said, the autosaving has been causing me problems all morning. It's unclear when it saves, and because it seems to lack a manual saving feature I was unable to save the last 30 minutes of work. All was lost when reloading the page reverted it to the version I started with. Very frustrating.
11
ianlevesque 3 days ago 0 replies      
Can we get a Sign Up with Email option? I don't like linking together other login services with sites I'm just trying out.
12
marcamillion 2 days ago 0 replies      
Re: pricing...given the feedback here, maybe one thing you may want to do is to sell credits.

So maybe each screen cost X credits - but you sell the credits in packages. $10 for 10, $20 for 40, $50 for 150, etc. The idea being that you get more credits, the more you pay.

The beauty of it is that you front-load a lot of your revenue, which makes planning easier. It's also easier from the user's perspective, because I know I can always come back and use those credits.

You can even set some reasonable expiration time period - say 24 months - on the credits.

Anyway, that's just my $0.02.

Awesome product, by the way!

Edit 1: Another interesting side effect of implementing a credit system is that you are laying the groundwork for a marketplace. People can auction/sell some of their designs, and others can use their excess credits to buy them.

13
kgosser 3 days ago 2 replies      
Max Lynch and Ben Sperry are super talented. They are also getting really good at marketing! Great copywriting!
14
jenius 3 days ago 2 replies      
When I saw this and clicked it, I was preparing myself for a bootstrap-in-production-code abomination, but was pleasantly surprised. The design and interaction on this site are gorgeous and not bootstrapped, and the tool looks really nice for putting together prototypes.

With all the horrible Bootstrap-default production sites I've seen lately, I've developed a pretty significant hatred for the framework. But sometimes, after seeing tools like this, I remember that it can be a very effective prototyping tool (as long as it's used only for prototypes, and you hire a real designer and/or front-end guy to put together your production design and code).

That being said, awesome job with this site and this tool. The only suggestion I could possibly make would be to change the header text from "build amazing websites" to "build amazing prototypes" - this at least helps spread the idea that Bootstrap does not replace good, unique design and code.

15
adamt 3 days ago 1 reply      
Looks very pretty, but I can't decide if this is useful or not. I love Bootstrap, but I am not convinced this is much quicker than a text editor (even if it is much prettier).

Quick bug report:

MacOS/Chrome (latest) - create navbar - click or double click on the bar, and you get errors.

    Uncaught TypeError: Object #<Object> has no method 'apply'  jquery.js:3332
    jQuery.event.dispatch  jquery.js:3332
    jQuery.event.add.elemData.handle.eventHandle

16
gadders 3 days ago 2 replies      
Pretty cool - looks like it has come a long way since the Alpha.

What I'd like to see added is an easy way to change the colour/font etc. of elements, to make them less bootstrappy.

17
slurgfest 3 days ago 0 replies      
I tried signing in with Google. Why do you need so much biographical information? Never mind, I won't use this.
18
tharshan09 3 days ago  replies      
Could you let us in on how to go about implementing such an interface? You don't have to go into specifics, but if you used any tools or frameworks that you found invaluable, it would be nice to know.
9
Does everyone hate MongoDB? serverdensity.com
374 points by dmytton  3 days ago   155 comments top
1
jandrewrogers 3 days ago  replies      
I think a lot of the "hating" is a side effect of MongoDB being consistently oversold in terms of its capabilities and architecture. Many people who are not experts on databases discover this the hard way later. If the claims about it were qualified a little better and the limitations acknowledged more openly by its proponents it would help to mitigate this outcome.

I am indifferent to MongoDB, but I do caution people that the internals and implementation are quite primitive for a database engine. It is the sort of database engine that a programmer who knows little about database engines would build. Over the years that has repeatedly manifested as problems no one should ever expect a properly designed database engine to have. There is a reasonable point beyond which you should not have to read the documentation to see if your database has a basic design flaw; there is an assumption of competent architecture for its nominal use cases.

MongoDB has improved in this regard over time, and many other popular NoSQL (and SQL) databases have similar problems. Users are being asked to cut the database some slack for less-than-competent engineering choices, but users just want a database that works. Being "simple" isn't enough.

10
NASA Rover Finds Old Streambed on Martian Surface nasa.gov
361 points by Sodaware  1 day ago   80 comments top 4
1
jpxxx 1 day ago  replies      
A quick Mars timeline:

Mars was formed around the time Earth was, but it was blessed with only 11% of Earth's mass and less than 40% of Earth's gravity field. Shortly after cooling solid, its "Noachian Era" was similar to proto-earth: warm, a thick atmosphere, plenty of liquid water on the surface, and probably a significant magnetic field.

But this era still fell within the Late Heavy Bombardment, a time in which the last dregs of the solar system were still settling out. Large asteroids still pounded the planets with regularity.

Unlike Earth, Mars had trouble maintaining its liquid iron magnetic field. Since it's much smaller, it cooled and thus congealed faster. And there's growing evidence that asteroid impacts were able to drive enough heat beneath the surface that interior convection was quelled, leading to a fragmented magnetic field.

Without an adequate magnetic field to deflect the solar wind, the atmosphere was prone to shedding pieces of itself into space. This was amplified by the lower gravity, which meant holding on to lightweight gases was even harder.

Over time, Mars cooled to the point where the major forms of tectonics ceased. The water locked up beneath the ground, rusted out pulverized basalt dust from the asteroid impacts, and frizzled in the radiation-baked atmosphere, floating off.

The seas and lakes dried, the rain stopped, and that... was that.

Three billion years later, we arrive on the scene and find out we have a little sibling. Then we send robots. We hope to find life, or evidence that it once lived. Characterizing how water worked in the Martian past is a part of answering that question.

2
jpxxx 1 day ago 3 replies      
Not to burst anyone's bubble, but this is not the staggering news it's being made out to be. It is good foundational geology, yes, but water has essentially been confirmed for years now.

The story of Mars in short: flop planet, can't hydrosphere.

There is extremely strong evidence that in the very early years Mars was capable of holding on to a great deal of water: Enough to cover the Southern Hemisphere. The streambed seen here is from that time.

The downer is that this was over 3 billion years ago. Through a variety of processes and for a number of reasons, most of Mars' water was lost to space or trapped underground.

The billion-dollar question that would be epic to answer: did Mars develop or acquire life during the time it had liquid water on the surface, and if so, is there any trace of it left, alive or dead?

3
teeja 14 minutes ago 0 replies      
No, they didn't find a "streambed". They found a bunch of clasts, with pebbles that wind couldn't have moved. Perhaps they were moved by something solid rather than liquid. Perhaps that was some kind of ice.

It's tiresome hearing of the "piling evidence" for water on Mars. Find some damn water. Prove it by melting it then boiling it on camera in a container with a thermometer.

4
bootload 1 day ago  replies      
"... NASA's Curiosity rover mission has found evidence a stream once ran vigorously across the area on Mars where the rover is driving. There is earlier evidence for the presence of water on Mars, but this evidence -- images of rocks containing ancient streambed gravels -- is the first of its kind ..."

Why isn't this front page, the implications are staggering.

11
Turn your 404s into lost children alerts notfound.org
352 points by jonny_eh  3 days ago   163 comments top
1
chimeracoder 3 days ago  replies      
I appreciate the noble motives behind this, but programs like the AMBER alert have not proven very effective[1].

Furthermore, abductions by strangers are incredibly rare; most abductions are by people that the child knows well (such as an estranged parent/relative, etc.). These are the cases for which the AMBER alert is most likely to have success, but they're also the cases for which it is the least likely to be necessary (ie, people investigating the case are going to be several steps ahead of a passerby who happens to drive by the billboard).

Also, 75% of children who are abducted and murdered are killed within the first three hours, so the shelf life of these alerts is incredibly short. It is even shorter for people who are sitting at a computer when they see the alert, not driving on a freeway.

[1] https://en.wikipedia.org/wiki/AMBER_Alert#Controversy_about_...

12
Beta Late Than Never (Steam Linux Beta) valvesoftware.com
319 points by phenylene  2 days ago   146 comments top 3
1
octotoad 1 day ago 1 reply      
I find this announcement kind of bittersweet. It's awesome, amazing and inspiring that a company like Valve has put this much effort into supporting Linux, no matter what their underlying motives are. At the same time, it's really sad that this is the biggest thing to happen to mainstream Linux gaming since Loki and id/Epic in the early '00s.

As somebody who went from being briefly caught up in the "convert-all-the-things-to-Linux" evangelism of the early '00s to the "fsck it, I don't care who uses it. I like it." attitude, I sincerely hope this makes an impact. Forget the FLOSS ideals and morals; if this means more 'power users' realize there's a viable Windows alternative for gaming (and more), everybody wins.

I know a lot of 'purists' will inevitably complain about the influx of 'noobs', but, fsck it. No matter what beliefs/philosophies you follow, it's always a good thing to be able to show people that there are other options.

As a long-time follower & sometime contributor to the free/open Unix scene, this makes me proud.

2
DigitalSea 2 days ago  replies      
It's about time a major player got behind Linux, and I'm glad it's Valve: they are arguably one of my favourite game developers, not just because they make great games, but because of their overall attitude and work culture. Hopefully this gives Linux a little needed spotlight boost and perhaps gains Ubuntu a few new users.
3
zissou 1 day ago  replies      
I don't even play video games anymore, but I played a hell of a lot of CS back in the day. Many years have passed, and I jumped on the Linux bandwagon about 3 years ago, and am now supposedly a productive member of society. But dammit, all I can think about now is plowing over AK-47 slinging bastards running through the top tunnel on de_dust with my pump shotty, or mowing people down with the UMP or headshots with the scout. I don't know, I loved the obscure weapons.

I would increase my consumption of video games to a non-zero amount if I could play CS on Linux.

Make it happen Valve. You got me, I'm now nostalgic.

13
“grep -R doesn't automatically search amazon” launchpad.net
316 points by shrikant  3 days ago   84 comments top 8
1
famousactress 3 days ago 2 replies      
My favorite bit appears in the comment chain:

>akeane, please stop with the snark.

Why is my bug report a "snark"? I have in good faith reported what I consider a bug in Ubuntu, namely that the functionality being added to the GUI is not also being consistently added to the CLI tools that so many of us rely on.

You have chosen to mark my bug report "invalid", which is your total prerogative and I have total respect for you doing that.

I am saddened, however, that you have chosen to resort to personal insults (being labeled a snark nearly made my monocle drop out!), rather than focusing on the technical issues presented.

2
spartango 3 days ago 4 replies      
While this is amusing, it's also rather disrespectful.

You may not agree with the decision by the Ubuntu team to incorporate a feature...and that's fine. There are plenty of ways to voice that opinion (blogs, email, forum).

Putting patently invalid (joke) bug reports in a system that's designed for actionable bug fixes just makes life a bit more troublesome for the people actually trying to fix things. Harassing them through this channel seems like a waste of their time.

Seems like Shuttleworth responded to this gracefully, though. :)

3
FuzzyDunlop 3 days ago 1 reply      
This lacks ambition.

    $> cd ~/Movies/Avengers
... people who changed into this directory were also interested in
Marvel Avengers Assemble [DVD] £10.00


$> ls ~
Desktop Downloads Movies Music Pictures
... related items
Intelliplug - Desktop Version £12.95

4
jbermudes 3 days ago 2 replies      
It amazes me how easily this whole thing could have been prevented if they had just made the Amazon results show up in a separate shopping lens instead of the default lens.

Then even if it's pre-installed there'd be some reasonable expectation that a program designed to show you shopping selections would have to connect to a 3rd party server and send your query.

5
bnr 3 days ago 1 reply      
Great to see Shuttleworth taking it easy...

> grep --universe might be a better shortcut ;)

https://bugs.launchpad.net/ubuntu/+source/gnome-terminal/+bu...

6
SilasX 3 days ago 3 replies      
Can someone explain the context of this? I'm not getting the references. I mean, I know what grep is, and I know the -R option, but ...

Is the joke that grep obviously has nothing to do with searching a server you're not on, and the submitter is pretending to be someone who expected that it would search Amazon for results? If so, that's stupid, and not even a clever joke.

However, from some of the comments, it sounds like real users (ones actually competent enough to be using grep) are expecting this functionality -- perhaps it doesn't work on some Amazon storage site?

7
benwerd 3 days ago 0 replies      
Awesome that Shuttleworth replied, and in good humor.

I still totally buy that ads are step one to Ubuntu integrating more closely with Amazon, and the latter making a pretty bold device play.

8
tomrod 3 days ago  replies      
Snark aside, it's a valid point. Shouldn't CLI tools mimic GUI tools?
14
Myspace previews complete redesign myspace.com
314 points by michael_fine  4 days ago   262 comments top
1
lordlarm 4 days ago  replies      
As a computer scientist and an Opera user this actually insults me: http://i.imgur.com/7k0bN.png

Ironically, Opera was the one that proposed the <video> tag which this site uses for its rotating LP, back in 2007. [1] Way to exclude users.

[1]: http://en.wikipedia.org/wiki/HTML5_video#History_of_.3Cvideo...

15
Adobe Edge Web Fonts adobe.com
301 points by ujeezy  4 days ago   82 comments top 13
1
nostromo 4 days ago 3 replies      
Adobe really needs to drop the corporate marketer talk.

"Adobe Edge Tools & Services: New tools and services for a beautiful, modern web."

"Edge Web Fonts is conveniently built into Edge Code today and will be available in Edge Reflow and other Edge Tools & Services soon."

I'm reading all this and I still have no idea what Edge is or why I should care.

A much more informative link for the HN audience is this: http://www.edgefonts.com/

2
dgreensp 4 days ago 2 replies      
Careful, you can't legally serve these fonts with your app, you can only link to them. Am I the only one who finds this a big deal? It means you can't control the uptime of your fonts, and it seems like it would complicate development when you're offline, too.

From a business standpoint, it makes perfect sense to turn fonts into a "service" by hyping the hosting aspect. It would be as if jQuery said you couldn't serve jquery.js from your own web server, only link to it, and then they started offering paid versions of jQuery.

I realize fonts are a commercial product, but my understanding is that Google Web Fonts really are free to use in your apps, whereas this is a free service. According to the terms, it is illegal to "retransmit" the "Service Materials".

3
thaumaturgy 4 days ago 1 reply      
Neat, but I don't understand why Adobe would do this. I already use Adobe Typekit, and $50/year gives me access to a huge library of fonts at an unlimited number of websites. For my purposes, I'm tempted to stop paying even that totally reasonable, meager amount and just use this free offering instead.

After following most of the links on the page, I can't find any mention of limits on pageviews or traffic. On the surface of it, that makes their free offering a little bit better than their paid offering.

4
gokhan 4 days ago 1 reply      
Sad that they don't support international characters (or at least not the full character set of the language we use here).

The fonts provided by Google have better coverage of character ranges. Anyway, thanks Adobe.

5
zalew 4 days ago 3 replies      

    <script src="http://use.edgefonts.net/league-gothic.js"></script>

I wonder why they pushed negotiation to the client side instead of doing it like Google: you need to send your user http://use.edgefonts.net/league-gothic.js instead of http://fonts.googleapis.com/css?family=Source+Code+Pro in order to serve your font, and it won't work with JS disabled.
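To illustrate zalew's point: a script-based loader has to construct and inject the stylesheet at runtime, which is exactly why it fails with JavaScript disabled, whereas Google's `<link>` URL is plain CSS the browser handles on its own. A rough sketch (the @font-face rule text here is illustrative, not what Adobe's script actually emits):

```javascript
// Illustrative sketch of what a client-side font loader must do; the rule
// text is made up, not Adobe's actual output.
function buildFontFace(family, woffUrl) {
  return (
    "@font-face { font-family: '" + family + "'; " +
    "src: url('" + woffUrl + "') format('woff'); }"
  );
}

// In a browser, the loader script would then inject the rule, roughly:
//   const style = document.createElement('style');
//   style.textContent = buildFontFace('league-gothic', '/league-gothic.woff');
//   document.head.appendChild(style);
// None of that runs with scripts disabled, while a plain
//   <link rel="stylesheet" href="...?family=Source+Code+Pro">
// needs no JavaScript at all.
console.log(buildFontFace('league-gothic', '/league-gothic.woff'));
```
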

6
kibwen 4 days ago 1 reply      
"In addition, Adobe will be applying its considerable font expertise to improving and optimizing a number of the open source fonts that are available in both Google Web Fonts and Edge Web Fonts. The teams from Typekit, Adobe Type, and Google Web Fonts are working to identify which fonts will benefit the most from our attention, and how we can best approach improving their rendering and performance."

Helping to improve fonts that are not just freely available, but freely available on services other than your own? It's almost like I'm starting to feel goodwill towards Adobe. It's rather strange.

7
systemtrigger 3 days ago 2 replies      
Browse the fonts listed in the select box:
http://2012.s3.amazonaws.com/edgefonts.html
8
jonny_eh 4 days ago 1 reply      
I've never heard of this "Adobe Edge". What's the deal here? I can't tell if it's a free thing or what.
9
adhipg 4 days ago 0 replies      
Am I right in thinking that they don't support Web Font Loader? [1]

Considering this is built on top of Typekit, which does support Web Font Loader, this is surprising.

[1] - https://developers.google.com/webfonts/docs/webfont_loader

10
rlt3 4 days ago 3 replies      
I see a few overlapping fonts (or, at least, overlapping names) from Google Web Fonts. But most seem new.

Is there going to be a new competition over who has the 'cutting-edge' fonts?

11
sjtgraham 4 days ago 1 reply      
Still no VAG Rundschrift. I have seen people request it on Typekit for years. Adobe is the licensor for this font; I have no idea why they haven't listened and either started offering it as a web font or explained why they won't/can't.
12
digitalengineer 4 days ago 1 reply      
Priceless!
"Our mission: move the web forward and give web designers and developers the best tools and services in the world."
And then: "Download a PDF version of this document (PDF, 47 KB)"
13
ghostblog 4 days ago  replies      
Damn........ they got a font called Lobster Two

This changes everything

16
“USSD code to factory data reset a Galaxy S3 can be triggered from a HTML page” exquisitetweets.com
290 points by EwanToo  3 days ago   174 comments top 4
1
forgotusername 3 days ago 2 replies      
Page text was:

    the USSD code to factory data reset a Galaxy S3 is *2767*3855# can be
triggered from browser like this: <frame src="tel:*2767*3855%23" />

2
GICodeWarrior 3 days ago 6 replies      
I created an Android app to intercept these requests and prevent them.
https://dl.dropbox.com/s/28lk6rn09x84qqg/AutoResetBlocker.ap...

Please test it and make sure it works for you.

  1. Open the above link on your phone
2. Install the application (it requires no special permissions)
3. Try this IMEI test: http://jsfiddle.net/kKFn8/
4. Check the box to make "Auto-Reset Blocker" the default action
5. Auto-Reset Blocker will show you the malicious number
6. Open this safe telephone number test: http://jsfiddle.net/tLHpw/
7. Auto-Reset Blocker will show the safe number and you will be asked which dialer to use
8. Select your normal dialer
9. Your normal dialer will open with the safe number

Again, please give it a try. If people like it, I will see about setting up an Android Market account to distribute it.

3
tomscott 3 days ago 2 replies      
Hello. I'm the guy who put this collection together. I've since tried to update it, and to hit 'delete' on it to avoid spreading misinformation, but Exquisite Tweets is still caching the original version. Mea culpa: I didn't do the research before passing it on.

There's been a lot of back-and-forth over whether it's true or not (check @pof's timeline for such), and a hell of a lot of people sending it on without double-checking. Myself included.

There is clearly a big security bug here (see the video linked), but it's extremely questionable as to whether it can be activated from a web page or whether it requires a bit of social engineering too!

[Edited to add: and just as I write this, @jwheare has cleared the cache and fixed the bug in Exquisite Tweets. Hopefully that should nip this in the bud.]

4
kristofferR 3 days ago  replies      
Here's a safe version of the exploit that displays your IMEI:
http://kristofferR.com/samsung.html

Check the HTML in your desktop browser first; for all you know, I might well be a malicious douchebag.

The exploit seems to require a stock Samsung Galaxy dialer, works fine on my cheap Samsung Galaxy Y but not on my friend's modded S3 with a vanilla Android dialer.

17
How to build a windmill jacquesmattheij.com
286 points by DanielRibeiro  1 day ago   53 comments top 22
1
btilly 1 day ago 3 replies      
The windmill is really cool, but reminds me of some trivia.

Remember how Don Quixote was fighting windmills because they were "giants oppressing the people"? He was right! And everyone in Cervantes' day knew it.

Windmills did not, contrary to popular belief, mostly spread as convenient labour-saving devices. Instead they spread as a way for the local lord to enforce taxes. When peasants had hand mills, there was no easy way to see how much food they had really grown, so it was hard to collect taxes. But if they went to the local miller, the miller took their grain, ground it, and took the lord's cut right there. There was no possible hiding of the food you'd grown.

In countries with a strong peasant class, like Sweden, the lords were unable to introduce this form of central taxation. And I've read reports that hand mills were still in use there as late as WW II.

2
nettdata 1 day ago 0 replies      
I had to chuckle at how he nonchalantly just whips up a mill simulator or a Python script for the shape of the blade. Each of those alone would be worth digging into in detail.

Needless to say, this article is my motivator for the day.

3
singular 1 day ago 0 replies      
Really awesome article @jacquesm, I love these intricate, in-depth, long struggling-for-a-labour-of-love stories.

In fact, not to wax overly lyrical, I think a lot of stories that appeal to people in general follow that pattern. There's something about the best aspects of humanity in that kind of endeavour.

4
lifeisstillgood 1 day ago 4 replies      
How do you find the time?

This is actually a serious question - for the various makers on HN, is it sacrificing other things: older (or no) children, flexible jobs, independent income? Or just really awesome time management?

I would like to know, so I can either stop beating myself up over bad time management, or improve it.

5
terhechte 1 day ago 1 reply      
That's a fantastic story. Now that the research is done, open-sourcing this data could allow many people, especially in poorer parts of the world (once they can access the right tools, which might be a severe stumbling block), to create similar machines and gain power.

I wouldn't have thought that creating a windmill is such a difficult thing. Always cool to learn something new from an unknown domain.

6
starpilot 1 day ago 1 reply      
Technical error: the force of the wind depends on the square of the wind speed (q = rho * u^2 / 2). The cube law refers to power (per unit area, proportional to rho * u^3 / 2).

Good article though. For those using software to design and analyze mechanical designs with "pretty good confidence," there's nothing more humbling than actually trying to build it.
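The two scaling laws starpilot distinguishes can be sketched numerically (the power coefficient below is an assumed illustrative value, not a figure from the article):

```javascript
const RHO = 1.225; // standard sea-level air density, kg/m^3

// Dynamic pressure q = rho * u^2 / 2 (force per unit area, in Pa):
// scales with the SQUARE of the wind speed u.
function dynamicPressure(u) {
  return (RHO * u * u) / 2;
}

// Captured power P = cp * rho * A * u^3 / 2 (in W): scales with the CUBE.
// cp is an assumed power coefficient; the Betz limit caps it near 0.593.
function windPower(u, sweptArea, cp = 0.35) {
  return (cp * RHO * sweptArea * u ** 3) / 2;
}

// Doubling the wind speed roughly quadruples the force on the structure
// but yields eight times the available power.
console.log(dynamicPressure(10) / dynamicPressure(5));
console.log(windPower(10, 3, 0.35) / windPower(5, 3, 0.35));
```
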

7
davidw 1 day ago 1 reply      
> It has survived numerous storms and worked very well supplying our house with reliable power, far more reliable than the solar panels we had used exclusively up to the point the windmill was finished.

Italy certainly has its share of troubles these days, but the above made me smile a bit. There are still some good things here.

8
js2 1 day ago 0 replies      
This is the best thing I have ever seen on HN. Kudos.
9
patrickk 1 day ago 0 replies      
How great would it be if he could mass-produce these! He gained incredible insight leading to the variable-pitch blades and the process of making the various parts... if I were a homeowner and actually had cash, I'd buy one in an instant.

Or he could even write a detailed ebook showing others how to do the same, and maybe sell the tricky parts, like the stator sheets and blades, to budding windmill DIYers. Great project.

10
tmh88j 1 day ago 1 reply      
Just being pedantic, but shouldn't it be called a wind turbine, not a windmill? After all, it's outputting electricity and you're not grinding down grains or corn. Either way, this was very interesting. Good work.
11
erikpukinskis 1 day ago 0 replies      
I bet http://opensourceecology.org/ would be really interested in this, if they wanted to open source the plans.
12
jcr 1 day ago 0 replies      
jacquesm, For some strange and unknown reason, I'm recalling a story
about a very powerful magnet suspended in the air, and very heavy
metal table... also in the air, attached to said magnet.
13
tlb 1 day ago 1 reply      
Wind force goes up as the square of speed, not the cube. Available power goes up as the cube.

Friends of mine are building vertical axis turbines. The key to safety seems to be to make the blades out of lightweight foam & kevlar so that if they do shatter they aren't flinging big heavy pieces around.

14
kriro 1 day ago 0 replies      
This is great. If someone asks me the dreaded "what is a hacker" question again I'll link them to the article.
15
stephengillie 1 day ago 0 replies      
I'm impressed that you made your own plasma cutting table. That project must be worth a blog post on its own!

What gauge and quality of metal is that?

16
Tipzntrix 1 day ago 0 replies      
How much of the difficulty could have been avoided if the windmill wasn't for generating power? This is an amazing project, but I wonder if it would be quite possible just to get something looking nice that stands strong (even against wind force at the cube of its speed) without such quality lathes and metalworking tools. Obviously, you have decades of experience in the field as you noted, and I doubt this would be possible for most people.
17
Enginoob 1 day ago 0 replies      
I do not believe you have built a windmill- you have built a proper wind turbine.

Very cool project- there are a good number of residential scale manufacturers out there, but I believe this is the first homebrew wind turbine I've seen.

18
ck2 1 day ago 1 reply      
What's the noise level like?

I've read the circular ones are far more quiet?

19
codediva 1 day ago 2 replies      
How much did it end up costing altogether, and how much would it cost to make another?
20
Datonomics 20 hours ago 0 replies      
He takes homebrew to a whole new level by building his own CNC plasma cutter.
21
na85 1 day ago 1 reply      
Nice article. The aerodynamics behind windmills are extraordinarily complex, so 500 Watts is a pretty good output!
22
sneak 1 day ago 0 replies      
Sweet hack!
18
A Conversation With Randall Munroe, the Creator of XKCD theatlantic.com
284 points by iamwil  2 days ago   20 comments top 5
1
haberman 2 days ago 1 reply      
"I got email from a bunch of physicists at MIT saying, "Hey, I saw your relativist baseball scenario, and I simulated it out on our machine, and I've got some corrections for you." And I thought that was just the coolest thing. It showed there were some effects that I hadn't even thought about. I'm probably going to do a follow-up at some point using all the material they sent me."

God, that is just gold. Randall really has a gift -- he speaks to our imaginations so much that he can count on geeks around the world to join the conversation around whatever he dreams up. Here's hoping that we get to see many more years of his creativity.

2
dkokelley 2 days ago 1 reply      
It's interesting to me, the awareness Munroe has of creativity, inspiration, and working environment. He sees that some things are purely distractions (e.g. Reddit, cleaning the bathroom floor) while having no distractions at all (e.g. a room with blank walls) is also detrimental to his creative process.
3
epidemian 2 days ago 1 reply      
> "I'm not a huge fan of some of the infinite scrolling things that are happening now. I think it's really annoying to want to read partway through, and then you navigate away, and can't get back."

Yep; could not agree more. One solution to this problem could be adding some pagination information to the window.location as you scroll, like the list item you're currently looking at, but that creates a new problem: if you hit Back, it just brings you to the previous list item instead of the previous webpage; not very predictable, IMO.
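One way around the Back-button problem epidemian raises is `history.replaceState`, which rewrites the current history entry instead of pushing a new one, so Back still returns to the previous page. A sketch (the `#item-` fragment naming scheme is made up for illustration):

```javascript
// Hypothetical fragment scheme for remembering the item currently in view.
function fragmentForItem(itemId) {
  return '#item-' + itemId;
}

// In the browser this would be wired up roughly as:
//   window.addEventListener('scroll', () => {
//     history.replaceState(null, '', fragmentForItem(currentItemInView()));
//   });
// replaceState rewrites the current history entry rather than pushing a new
// one, so the Back button still leaves the page instead of stepping through
// items, and on reload the page can scroll back to location.hash.
console.log(fragmentForItem(42)); // "#item-42"
```
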

4
adambratt 2 days ago 1 reply      
Really interesting article. Thanks for posting it!

Lately it seems almost every one of his articles has been really well thought out and insightful.

Does anyone know if this is what he does full-time? I assume so, seeing as he said he spent a solid month and a half on the money chart.

5
tisme 2 days ago 1 reply      
Randall is a great guy with lots of really nice stuff. That said, there is an absolute glut of xkcd stuff on HN compared to other sites that get mentioned here; xkcd links handily outnumber even Wikipedia links. One of the HN-originated memes is the obligatory xkcd link.
19
The Pendulum Swings, Again jacquesmattheij.com
277 points by hawke  4 days ago   122 comments top 3
1
bad_user 4 days ago  replies      
The most interesting thing about these recent trends is human nature.

I remember back in year 2000/2001 how everybody was talking about freedom, open-source and the open nature of the Internet. I remember how the closed garden that Microsoft tried to create was frowned upon. Small companies that were picking the Internet as a delivery platform and using open-source/multi-platform technologies were on the forefront of innovation. Of course regular consumers and businesses never cared, but as Paul Graham once said, if you want to see the future trends in computing, you have to look at what hackers are using today.

Then OS X happened: this UNIX-compatible OS that was shiny and cool, where all of your UNIX tools worked and you could run some pretty important proprietary software too, like MS Office or Photoshop. It was more productive for developers than Windows, and compared to Linux it was friendlier to everyone. Suddenly Apple was hip again, and it slowly captured the hearts of developers.

Then the iPhone happened, and people didn't mind that it was a closed garden, because there had never been anything like it. Anything that Apple allowed on this new platform was taken as a gift; it was their platform, so if they wanted to ban an app for "duplicating existing functionality", then openness be damned: it was their product after all. Then the stories about lone developers getting rich on the App Store happened, and people didn't mind being at the mercy of Apple, as long as they could have a piece of that awesome pie.

Of course, countless reasons were given by tech pundits trying to rationalize the walled garden they've created: it is better for grandmas whose PCs are ridden with viruses, it is better for the protection of our children, it exposes computing to a wider audience (even though computing in this context mostly means consumption), it solves the problem of app marketing for individual developers without huge marketing budgets, etc. There's always some reason why Apple was right to act the way it did. Even now that they've released a shitty GMaps replacement, some genuinely believe that they had no choice, when for a company like Apple there are always choices available.

Let's not forget for a moment the ultimate argument in defense of this closed garden: if you don't like it, you are free to go somewhere else.

And now Apple has started suing left and right, which in my opinion is what companies do when they find themselves in the innovator's dilemma, and it is doing so while dropping the ball on new versions of its products. They are still successful and they might produce some more golden eggs in the future, but the innovation frenzy of the iPod era is over and they know it.

And yet people still cheer for them, even though as far as openness is concerned, Apple makes Microsoft look good. And it was only 12 years ago that people hated Microsoft with a passion for being an obstacle to innovation, even though Microsoft never banned any app from running on Windows or restricted its usage only to certain hardware (but surprise, since Apple has been doing it so successfully, Microsoft is going to start doing it with Windows 8 ... hurray for the renewed and totally not evil Microsoft).

I own an Android phone and an iPad. I love my iPad, but it was a gift and I secretly yearn for a Nexus tablet with the same size + 3G. I also voted with my wallet against apps like Instagram, because I'm primarily an Android user and the aesthetic senses of developers like Marco don't really solve any of my problems.

I also remember the day I got my Galaxy S, even though I owned an iPhone 3GS ... I went out of my way to buy one out of frustration, because Apple was banning apps for blocking calls and SMS messages from specific phone numbers (but hey, look how it "just works"). And I predict similar frustration levels as use cases for my iPad unfold. Already I'm pretty pissed off about my carrier having the ability to enable/disable the tethering option on my iPad.

If this is the future of computing, then I shudder to think of the consequences.

2
jonnathanson 4 days ago 1 reply      
The relative ubiquity of the internet is an important consideration here. In previous swings of the pendulum, computing (and/or the internet) was concentrated among a much smaller base of users, many of whom had strongly vested interests in the side to which the pendulum was swinging.

These days, the internet is certainly ubiquitous, and "smart" mobile devices are basically ubiquitous, if not yet entirely so. And the vast majority of today's users don't seem to care about privacy, open vs. closed ecosystems, etc. The sheer volume of users who either don't know, or don't care, about these things shifts a LOT of power into the hands of the current power players in the status quo. Hackers and power users are still major forces for change, and always will be. And, in time, they may prevail by offering better solutions to the masses. But the weight of the masses is heavier than it ever has been.

I submit that the fulcrum on which the hackers will tilt the masses isn't security, privacy, or anything of that nature. It's openness, and by proxy, IP management. It's the free transmission (or lack thereof) of ideas, information, and content. It's the interoperability (or lack thereof) of all of these things between devices and systems, outside the restrictions or central control of one or two major parties.

But the force currently arrayed against the growth of openness is a powerful one. It's convenience. And convenience is, arguably, THE most powerful driver in human psychology. Apple, Amazon, Google, Microsoft, telecoms, and other players in the content space are making plays to become total-solution providers. And the appeal of a one-stop, total solution (the Walmart Effect, if you will) is quite powerful to the masses. To overcome this appeal, the next great disruption will need to equate openness with convenience. Open must = easy. More precisely, open must be more highly correlated with ease than walled gardens are correlated with ease.

An historical side note:

AOL's monopoly in the ISP game, which lasted for so many years, came about because AOL offered the easiest, most idiot-proof, and most convenient internet experience. It was a walled garden, but beyond the wall, the typical user just saw a massive, untamed jungle full of complex systems, wonky communities, and seemingly insurmountable technical difficulties. AOL fell when people figured out how to make the world outside the wall more ordered, more appealing, and more uniformly approachable. Open standards had a lot to do with this. So did the appearance of portals, which represented a less daunting half-step between the Great Untamed Wild and the AOL garden. And search engines continued this evolution, making the point of entry into most users' internet experience very centralized, orderly, and convenient without walling everything up. Search engines created the subconscious appearance of a walled garden without actually erecting walls.

3
protomyth 4 days ago  replies      
I think the article misses one of the fundamental reasons that mobile is a closed environment: fear. It seems we treat things we can put in our pocket as more personal and much different than the twilight zone we expect out of computers.

I am sure, at some point, a program uploaded the Outlook address book of some Fortune 500 sales manager to the internet without telling him/her. Any problem or security breach in the PC world is a 1-day story.

The same thing happens on a mobile platform (iPhone) and C-level executives are dragged in front of Congress. I gotta tell you, if it's a choice between programmers' rights and being dragged in front of Congress because some startup didn't use proper hashing, I would, as a CEO, limit programmers. Pure, simple, and a smart decision for 99.9% of my customers. I will bet that if Google has more executives "requested" at Congressional hearings, side-loading will disappear.

The post-PC devices are going to be locked down in the name of security. There is no downside to executives. Some developers will put up with it because of the money just like they did in the pre-iPhone days of mobile deployment.

I hate this because I know if I'd been born 20 years later, I would not be a developer. High Schools are not teaching programming anymore and the computers I learned to program on (Atari 400, C64, TI 99/4A) have no modern replacements (sub $200 with development tools included / available cheap).

Someone wants to change all this? Then build a modern day Atari 800 / C64. Not OLPC, because you cannot just buy one of those. Something I can hook to the internet and an old TV (since now most families have bought the 2nd generation of HDMI devices). Something that lets me program it.

20
Websockets 101 pocoo.org
273 points by the_mitsuhiko  4 days ago   50 comments top 13
1
btmorex 4 days ago 0 replies      
If anyone finds themselves needing to write a Websockets implementation, there's an awesome protocol test suite at http://autobahn.ws/testsuite

I wrote a C++ implementation for a side project and with the aforementioned test suite I actually found it pretty easy to get to 100% compliance. There's some ugliness in the protocol because of proxies, but it's definitely not the worst protocol in the world. The only big missing feature is compression and there's a proposal for that (you could certainly do application level compression, but I'd rather avoid writing compression code in JS).
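For anyone curious what such a test suite exercises, the core of the wire format is the frame header. Here is a rough Python sketch of parsing it per RFC 6455 (a sketch only: no reserved-bit checks or control-frame length validation):

```python
import struct

def parse_frame_header(data: bytes):
    """Parse a WebSocket (RFC 6455) frame header.

    Returns (fin, opcode, masked, payload_len, header_len),
    or None if more bytes are needed.
    """
    if len(data) < 2:
        return None
    b0, b1 = data[0], data[1]
    fin = bool(b0 & 0x80)
    opcode = b0 & 0x0F
    masked = bool(b1 & 0x80)
    length = b1 & 0x7F
    offset = 2
    if length == 126:              # next 2 bytes hold the real length
        if len(data) < offset + 2:
            return None
        (length,) = struct.unpack_from(">H", data, offset)
        offset += 2
    elif length == 127:            # next 8 bytes hold the real length
        if len(data) < offset + 8:
            return None
        (length,) = struct.unpack_from(">Q", data, offset)
        offset += 8
    if masked:
        offset += 4                # 4-byte masking key follows
    return fin, opcode, masked, length, offset
```

Feeding this the masked "Hello" example frame from the RFC yields a 5-byte payload starting at offset 6.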

2
tinco 4 days ago 3 replies      
Very nice readable article. Having dabbled with implementation a bit I agree with most of his points.

The author complains about websockets being frame-based instead of stream-based, because, he argues, if he wanted a frame-based system he could easily model it on top of a stream.

I think frames are awesome. In my opinion it is easier to build a stream on top of frames than to build frames on top of a stream (for starters, you don't need an extra protocol). Besides that, almost anything you would like to do can be expressed in frames, saving most of us a lot of headaches.
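To make that concrete, here is a minimal sketch (my own, names invented) of the length-prefix framing that every application ends up reinventing when the transport only gives you a byte stream:

```python
import struct

def frame(payload: bytes) -> bytes:
    # Prefix each message with a 4-byte big-endian length.
    return struct.pack(">I", len(payload)) + payload

def deframe(buf: bytes):
    """Split a byte stream back into messages; returns (messages, leftover)."""
    messages = []
    while len(buf) >= 4:
        (n,) = struct.unpack_from(">I", buf)
        if len(buf) < 4 + n:
            break                  # message still incomplete, keep the tail
        messages.append(buf[4:4 + n])
        buf = buf[4 + n:]
    return messages, buf
```

With a frame-based transport like WebSockets, none of this bookkeeping (or its partial-read edge cases) exists in application code.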

I would like to add that websockets are not the end-all answer for web-game programmers. It makes some stuff easier, and reduces some overhead, but it's still on TCP, meaning realtime multiplayer games are still as good as impossible to implement.

Sadly all the issues with proxies go for UDP and then some, so I don't think we'll see any ubiquitous browser UDP protocol anytime soon :(

3
peterwwillis 4 days ago 5 replies      
Encryption is not a panacea. As more websites use HTTPS, more corporate networks are installing content filters that passively filter HTTPS content.

Yes. I just said corporate networks are spying on HTTPS connections.

It's really simple to implement. Get some kind of web proxy that supports it (Websense is pretty popular, amazingly) and generate a root cert on the box. Then use a group policy on your AD server to distribute the root cert to all the client systems. Now Websense can decode the HTTPS traffic by issuing its own fake certs to clients and handling the "real" HTTPS handshake on the frontend proxy side.

Result? Your websockets are still going to get fucked with by proxies. Can we please stop building fake protocols on top of real protocols now?

4
jorangreef 3 days ago 1 reply      
Browser vendors must start to move away from top-down innovation, where they hoard APIs such as TCP and UDP and release sub-standard specs in their stead. Instead they must expose only the bare minimum OS low-level APIs (TCP, UDP, POSIX), keeping the surface area as small and powerful and direct as possible, and then let open source grow around this. Innovation needs to be decentralized, bottom-up. Not designed by committee.

For that to happen, we need to stop conflating "web apps" (trusted, installed, anointed with machine power) with "web pages" (accessed by a single link click). At present, Web Apps are suffering, crippled by being lumped under the same security policy as web pages. But Web Apps need to have access to raw machine resources in the same way that Native Apps do.

Those that don't seem to care for any of this insistence tend also to be naive as to the massive differences between TCP and WebSockets, and between IP and TCP and the whole stack in general. The WebSocket spec is a good example of people doing things in the most indirect way possible, with a maximum of red tape, as opposed to doing things in the most direct way possible, with a minimum of red tape.

The Web as we have it in these respects is very much Animal Farm and 1984. There appears to be little thought leadership from the major stakeholders in this regard. People like Tim Berners-Lee are asking for change (http://lists.w3.org/Archives/Public/public-webapps/2012JanMa...), but the new incumbents don't seem to want to see.

5
jschrf 4 days ago 1 reply      
Excellent article, thank you for sharing.

As someone who has implemented a WS server and client and would like to be able to host them in Amazon's cloud, the notes about ELB are worrisome.

6
romaniv 4 days ago 4 replies      
Why don't standards like WebSockets include anything that can be used in pure HTML? This would be a great tool for building efficient dynamic apps, if you could "submit" forms without going to a new page, and then get incremental page updates as a response.
7
Evbn 4 days ago 1 reply      
> Messages sent from the client to the server are obfuscated with a basic transmission mask of 4 bytes which is sent within each websocket package.

What is this? Some pretend security feature?
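For what it's worth, RFC 6455 presents the mask not as a security feature but as a defense against cache-poisoning of intermediaries that misinterpret WebSocket traffic as HTTP. The operation itself is a plain XOR with the 4-byte key; a sketch:

```python
def mask_payload(payload: bytes, key: bytes) -> bytes:
    """XOR each payload byte with the 4-byte masking key (RFC 6455 sec. 5.3).

    Masking is its own inverse: applying it twice restores the input.
    """
    assert len(key) == 4
    return bytes(b ^ key[i % 4] for i, b in enumerate(payload))
```

Because the client picks a fresh random key per frame, an attacker can't force predictable bytes onto the wire; that is the entire point of the exercise.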

8
TazeTSchnitzel 4 days ago 1 reply      
(Shameless plug...)

I have a list of websocket libraries and a small echo server demo here: http://ajf.me/websocket/#libs

9
zafriedman 4 days ago 1 reply      
An even quicker introduction to the final specification of WebSockets: http://socket.io.

P.S. I'm aware that a) this doesn't help anyone not using Node.js on their server (it's not even part of my production stack at work (yet!), even though I'm bringing it to light here) and b) it's more than just WebSockets, for instance it will gracefully degrade on legacy browsers.

I just can't miss an opportunity to sing its praises because it has so many benefits over the simple implementation.

10
kaiserama 4 days ago 0 replies      
Pop-up Video Excerpt:

The author (Armin Ronacher) also authored Flask (Python web micro framework) and its template engine Jinja2. I've used Flask for a number of my most recent projects and really have enjoyed working with it.

Back to your scheduled broadcast!

11
chadrs 4 days ago 0 replies      
Man, I would have loved to have this about a year ago. Although then I would have missed out on some of the hilarious drama on the websocket mailing list (e.g. http://www.ietf.org/mail-archive/web/hybi/current/msg08668.h...)
12
rizwan 4 days ago 1 reply      
For folks thinking about using Socket.IO for websockets, it works very well, but remember that most open-source implementations ONLY implement WebSockets and NONE of the other fall-backs that are in the Socket.IO spec. So without some work, these libraries are often useless for mobile development, for instance.
13
bowmessage 4 days ago 0 replies      
Loving the article! Just thought I'd point out: "needles complexity"
21
Firefox turns 10 years old today (Phoenix v0.1) mozilla.org
251 points by mburns  5 days ago   44 comments top 17
2
tree_of_item 5 days ago 1 reply      
>You said this is a lean, lightweight browser, but it's 8MB! I laugh at your silly lies!

>Take it easy, sport. Phoenix has many files that override those in Mozilla, but it also has a new set of files. These files render a ton of files in Mozilla unnecessary, but we haven't yet stopped packaging the old files. It also still contains the modern theme and all the composer UI. In short, we haven't done any work yet to minimize the size, but we expect to be able to hit 6MB with a little work.

Things have changed quite a bit.

3
nsns 4 days ago 1 reply      
Congratulations! A toast to one of the most prominent (and, alas, one of the last, with Wikipedia) non-consumerist-centered tools of the web.
4
sabret00the 5 days ago 0 replies      
Happy birthday. Thank you for serving me with my best interests at heart for so long.
5
acomjean 5 days ago 1 reply      
Ten years ago the browser market was very different: IE was dominant and many sites (banking in particular) were "IE only". The Mac had a version of IE, and Linux as a web-surfing platform was wanting.

Firefox has been a good thing.

For those using Firefox, the browser has more information:

go to the address
about:mozilla

Maybe "information" is a stretch, but it's fun.

6
pan69 5 days ago 4 replies      
The Firefox Wikipedia page states:

"Initial release November 9, 2004; 7 years ago" [1]

Maybe they're referring to the 1.0 release? I do remember using Firefox way earlier than that; it must have been 2000 or 2001. I guess those were 0.x releases. Wikipedia doesn't seem to say anything about that.

[1] http://en.wikipedia.org/wiki/Firefox

7
mariuz 4 days ago 0 replies      
Here is my guide to running it under Ubuntu Karmic:
http://mapopa.blogspot.ro/2009/11/firefox-01-aka-phoenix-on-...

My guess is that the same instructions should work on Ubuntu 12.04 LTS.

8
joneil 5 days ago 0 replies      
I started using Phoenix at v0.3 and remember just how "new" it felt. At the time I was still running Windows (maybe even 98?) and was experimenting with Linux, but hated the netscape/mozilla options. Phoenix was fast, had tabs, a popup blocker, and no ads (unlike Opera). I was a proud user then, and I think they've done remarkably well given the massive competition from three commercial software giants - I'm still a happy user today. Well done Mozilla.
9
chris_wot 4 days ago 1 reply      
It's interesting that 0.3 had the ability to right click on an image to block the server it comes from... I'm assuming this is a preference or an extension nowadays!

http://www.mozilla.org/en-US/firefox/releases/0.3.html#new

10
RyanMcGreal 4 days ago 0 replies      
After making the jump from Netscape 4.5/4.72 to Mozilla 1.2, I stayed on Mozilla until Firefox 1.0 came out. I've been using Firefox ever since. I've got Chrome installed, mostly to check that websites work properly in it, but it never pulled me away from Firefox.
11
peterwwillis 4 days ago 0 replies      
I really miss Phoenix. The tiny size, almost non-existent feature set, incredibly snappy speed, and still wide compatibility with most (if not all) websites of the day.

Then there's Firefox. It's taken years to get to a point where you aren't swapping from three page loads. It has hardware graphics rendering and built-in video codecs. An entire development environment. And "helpful" features that try to guess what you're thinking and end up using more bandwidth, i/o and cpu than is necessary.

I know, i'm a luddite, i'm old-fashioned, i'm hindering progress. But get off my lawn! I just want a single tool that does something well. What's wrong with just releasing plugins for the features that aren't strictly text and image web content?

12
rangibaby 5 days ago 0 replies      
Already?! HB Firefox. And thanks for showing us that there was a better way than IE6.
13
dimitar 5 days ago 6 replies      
Early Firefox felt really responsive and it was more functional and safe than IE.

What happened? I still use it, but I definitely don't feel the same about it.

14
starik36 4 days ago 0 replies      
I remember installing it for the first time and it felt just right. The other browsers at the time (Mozilla Suite, Netscape 6.x, IE) just felt so bulky.

You could tell right away that this software was going to go places.

15
yenoham 4 days ago 0 replies      
I never thought I'd get to a time where web browser milestones began to make me feel old.
16
dain 5 days ago 0 replies      
Second line down "earch" should be "Search", in case anyone at Mozilla is reading this.
17
noibl 5 days ago 0 replies      
What, no gimmick? PR fail.

(Not that it seems to have done much for Opera, but Mozilla is in a better position to capitalise.)

22
California passes groundbreaking open textbook legislation creativecommons.org
247 points by drostie  20 hours ago   67 comments top 11
1
waterlesscloud 19 hours ago 3 replies      
I am hugely in favor of this move. It's a tiny first step to decreasing the cost of education and further opening educational access to truly everyone.

But, if I may indulge in a cynical moment, I note that one of the touted benefits of CC-BY in the article is that commercial companies can develop tutorials from these taxpayer-funded books.

What if the tutorials then become the required texts? Are there any safeguards from this? If not, it's inevitable that it will be done.

2
gosub 18 hours ago  replies      
There is something I don't understand. From the article it seems that this only provides for the creation of free/affordable textbooks, but does not force their use in colleges. Am I wrong in thinking that affordable textbooks are already available? What is being done to prevent colleges and professors from using $150, annual, nth editions of the same old textbooks?
3
rorrr 19 hours ago 0 replies      
Fucking finally.

I hope the rest of the country follows.

p.s.

It's pretty sad this obvious move is considered "groundbreaking". All it took was a pair of balls.

4
tingletech 19 hours ago 0 replies      
I work for the California Digital Library, and most of my work involves using or writing open-source code; this bill establishes the California Open Source Digital Library.

Ironically, we launched a textbook site (only for UC) this week http://licensed.cdlib.org

5
jug6ernaut 15 hours ago 2 replies      
I do not know enough about this to be for or against it. But as a recent college grad it leaves me with two obvious questions.

1. This legislation doesn't address the issue of overpriced textbooks; it simply bypasses it altogether. Is there anyone out there saying that the current selection of textbooks is bad in some way other than their price? I find it astounding that they would pass legislation in an attempt to bypass a whole industry instead of actually addressing the issue.

2. Why can't these textbooks be made without this legislation?

6
Zenst 16 hours ago 1 reply      
Sounds brilliant, with one small question about this: "So, in addition to making the digital textbooks available to students free of cost, the legislation requires that print copies of textbooks will cost about $20." This risks disadvantaging students who don't have access to a device that can read the free open-source version; for many, a hard copy is simply more suitable. Legally fixing a price at "about $20" also raises the question of what "about" means as a fiscal term, and more besides: whom do you pay to print out your own copy? Because as it reads, if you print it out yourself then you should be paying "about $20" for it, and this is completely unclear.
At least it is to me.

Still, the motives are good, and anything moving in the right common-sense direction has to be applauded.

7
justincpollard 9 hours ago 1 reply      
So now textbook authors are forced to license their books in such a manner? Or is this a choice that certain authors can make?

If it's the former, financial incentives for textbook authors have fallen precipitously. So, the question is, who's going to write these "free, openly licensed digital textbooks for the 50 most popular lower-division college courses offered by California colleges"? Might the authors receive government subsidies? If this is true, then we're simply shifting from a market mechanism to a government funded model. Either the cost of writing a textbook won't actually change, the quality will go down, or we'll have many fewer options to choose from.

8
jonny_eh 19 hours ago 1 reply      
So which textbooks will this include? Do they need the publishers' permission to redistribute their existing work under a CC BY license?
9
jimhefferon 13 hours ago 0 replies      
I offer a Linear Algebra text that is Free. Some years ago I was contacted by some people in California and prompted to apply to become some kind of official Free source in that state; it involved filling out some forms on some websites. As far as I can tell, nothing ever came of it. So I'm a bit dubious about this initiative.
10
Steuard 16 hours ago 0 replies      
I wonder if they'll use the existing CC-BY textbooks produced by OpenStax (or others) for some of these courses? http://openstaxcollege.org/
11
emehrkay 12 hours ago  replies      
As a non-California resident, I have to say Jerry Brown seems to be doing a pretty impressive job. He seems to talk about the real issues, especially when it comes to Cali's budget, and puts good legislation through.
23
Torvalds' quote about good programmers stackexchange.com
241 points by lx  5 days ago   104 comments top 7
1
robomartin 5 days ago 2 replies      
Absolutely right. I was lucky enough to learn this in college. Although, I did not learn it from the CS professors but rather my physics prof. He was a champion for a language called APL and he actually cut a deal with the CS department to accept credits for taking an APL class he was teaching as a substitute for the FORTRAN class. APL was an amazing mind-opening experience.

Throughout the APL 101 and 102 courses he would repeat this mantra: "Work on your data representation first. Once you have fully understood how to represent your data start thinking about how to manage it with code."

He would throw this at us constantly. At the time it sounded like our physics prof had lost his marbles (he was a very, shall we say, eccentric guy). It would take a few years after college for me to realize the value of that advice.

Put another way, our business is managing data of some sort. Whether you work on embedded systems or web applications, you are always dealing with data. You can make your programs far more complicated than necessary by neglecting to choose the right (or a good) representation of your problem space (data).

I equate it to designing an assembly line. Anyone who's watched a show like "How It's Made" cannot escape the realization that efficient manufacturing requires engineering an efficient assembly process. Sometimes far more engineering work goes into the design of the manufacturing process and equipment than into the part that is actually being made. The end result is that the plant runs efficiently and with fewer defects than alternative methods.

In programming, data representation can make the difference between a quality, stable, bug-free and easy to maintain application and an absolute mess that is hard to program, maintain and extend.
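A toy illustration of the point (my example, not the commenter's): the same question asked against two representations of the same data. All names are invented.

```python
from collections import Counter

ballots = ["alice", "bob", "alice", "carol", "alice", "bob"]

# Representation 1: keep the raw list; every question becomes a loop.
def count_votes_scanning(ballots, candidate):
    total = 0
    for b in ballots:
        if b == candidate:
            total += 1
    return total

# Representation 2: pick a structure shaped like the questions you ask.
# Once the data lives in a Counter, most of the "code" disappears.
def count_votes(ballots):
    return Counter(ballots)
```

With the second representation, "who's winning?" is `count_votes(ballots).most_common(1)` instead of yet another hand-written loop.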

2
antirez 5 days ago 2 replies      
This is one of the few programming quotes that is not just abstract crap, but one thing you can use to improve your programming skills 10x IMHO.

Something like 10 years ago I was lucky enough that a guy explained this stuff to me (which I was starting to understand myself, btw), and programming suddenly changed for me. If you get the data structures right, the first effect is that the code becomes much simpler to write. It's not rare to drop 50% of the code needed just because you created better data structures (here better means: more suitable to describe and operate on the problem). Or something that looked super-hard to do suddenly becomes trivial because the representation is the right one.

3
michaelochurch 5 days ago 2 replies      
The problem with code quality is that there's so much AND-ing that most people give up on understanding this massively difficult problem that is as much social and industrial as it is technical.

One of the first things you learn about systems is that parallel dependencies (OR-gates) are better than serial dependencies (AND-gates). The first has redundancy, the second has multiple single points of failure. That's also true with regard to how people manage their careers. Naive people will "yes, sir" and put their eggs into one basket. More savvy people network across the company so that if things go bad where they are, they have options.

To have code quality, you need people who are good at writing code AND reasonable system designs AND competent understanding of the relevant data structures AND a culture (of the company or project) that values and protects code quality. All of these are relatively uncommon, the result being that quality code is damn rare.

4
mtkd 5 days ago 6 replies      
You can normally fix bad code - fixing bad data structures is not usually easy or even possible.

It's why I've still not fully bought in to 'release early release often'.

I prefer to defer releasing for production use until really satisfied with the structures - this way you have no barrier to ripping the foundations up.

If not 100% comfortable with the model - prototype a bare metal improved one (schemaless DBs ftw) - if it feels better start pasting what logic/tests you can salvage from the earlier version and move on.

5
barrkel 5 days ago 3 replies      
This is approximately the same reason as why I start out writing most of my programs by creating a bunch of types, and why I find dynamic programming languages uncomfortable to use.

I'm less and less a fan of the ceremony of object orientation, but I think there's a lot to be said for having a succinct formalized statement of your data structures up front. Once you understand the data structures, the code is usually easy to follow. The hardest times I've had comprehending code in my career, apart from disassembly, have been from undocumented C unions.
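A small sketch of what "a succinct formalized statement of your data structures up front" can look like in Python (all names invented for illustration; a style sketch, not anyone's real code):

```python
from dataclasses import dataclass
from typing import List, Optional

# The data model comes first, before any logic.

@dataclass(frozen=True)
class Money:
    amount_cents: int
    currency: str

@dataclass
class LineItem:
    sku: str
    quantity: int
    unit_price: Money

@dataclass
class Order:
    id: str
    items: List[LineItem]
    coupon: Optional[str] = None

def order_total(order: Order) -> Money:
    # Once the shapes are pinned down, the logic is nearly a one-liner.
    currency = order.items[0].unit_price.currency
    cents = sum(i.quantity * i.unit_price.amount_cents for i in order.items)
    return Money(cents, currency)
```

Anyone reading `order_total` can follow it straight from the type definitions above, which is the comment's point about code being easy to follow once the data structures are understood.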

6
InclinedPlane 5 days ago 1 reply      
To paraphrase, if I may, a novice imagines that the goal of programming is to create code which solves a problem. This is true, but limited. The real goal is to create an abstract model which can be used to solve a problem and then to implement that model in code.
7
hermannj314 5 days ago  replies      
Next week on Hacker News:
Bad Programmers worry about their code. Good programmers ship.

"Bad programmers [technique A on programming KPI metric N1]. Good programmers [technique B on programming KPI metric N1]."

Responses: Someone will ask, "What about metric N2?" And someone will say, "What about technique C?" Someone will post a personal anecdote showing that people really underestimate the value of A. Someone will respond to that by posting a hyperlink to an anecdote that shows technique B really is what matters.

24
Entire field of particle physics is to switch to open-access publishing nature.com
239 points by ananyob  3 days ago   32 comments top 11
1
Steuard 3 days ago 1 reply      
The SCOAP^3 consortium's news announcement about this can be found at http://scoap3.org/news/news95.html

Something like this may have been inevitable in particle physics: with essentially all articles appearing freely on arXiv.org, the journals have already been starting to look less necessary. That reality may have made them more willing to agree to something like this. It will be interesting to see, ten years from now, whether this model continues to be viable or whether the field will have adopted some entirely different mechanism for peer review.

It appears that the articles will be published under CC-BY licenses. The definition of the affected articles is quite broad, too: "SCOAP^3 Articles are defined as either all articles appearing in journals mostly carrying High-Energy Physics content, or articles appearing in “broad band” journals which have been submitted by researchers to arXiv.org under the corresponding categories."

The close integration with arXiv.org is pretty much essential for this to work, but I was still a bit surprised to see that arXiv categories are used as the defining feature of "particle physics content". (For those in the know, those categories are hep-ex, hep-th, hep-ph, and hep-lat.)

2
pav3l 3 days ago 2 replies      
Why, in this day and age, is there still publicly funded research that is not open access? Also I could rant for hours on the need to make your data available for any peer reviewed publication.
3
SeanDav 3 days ago 1 reply      
It seems to be a really good step towards breaking away from the current journal publishing monopoly which makes access to cutting edge research so expensive. I hope other branches of science adopt this as well.
4
nkurz 3 days ago 1 reply      
Physical Review D, the journal that publishes most papers in the field, negotiated a fee of US$1,900 per article “on the principle that we should maintain our revenue”, says Joe Serene, treasurer and publisher at the American Physical Society, which owns the journal.

The "principle that we should maintain our revenue"? I like that principle. Which box do I check to have that apply to me as well?

I don't understand why SCOAP3 isn't driving a harder bargain. They are anticipating a $10MM budget --- wouldn't this be enough to hire some good editors and publish online?

If the whole field is behind this, worries about "impact factor" should disappear. Or is the problem that salary/tenure/promotion is tied to an outside assessment of "impact"?

5
pbsurf 3 days ago 0 replies      
Almost all physics papers (not just particle physics papers) are posted on arXiv. Papers on arXiv are usually updated to the final published version, although the peer review process rarely produces significant changes.

It is unfortunate that university libraries will be continuing to send money to publishers who add almost no value to the scientific process.

6
smoyer 3 days ago 3 replies      
I guess my question is whether we need the journals at all ... the process of printing the journal itself certainly isn't the hard part, but is there a way to properly do peer review in an open system?

I'm certainly happy to see this consortium has moved in the right direction, but could it be even more open?

7
tingletech 3 days ago 1 reply      
"Upfront payments from libraries will fund the access." Great, so the Library still has to pay for it...
8
creat0 3 days ago 0 replies      
It's great to see Nature, itself a high-priced journal, running this story.

arXiv.org really makes downloading papers a breeze. If only it were so easy in other disciplines. It's a lot easier than downloading articles from, say, ScienceDirect. The latter is, despite its name, a lot less "direct" than the former. Just count the HTTP redirects and the number of domain names looked up. And many journals seem to have their own idiosyncrasies vis-a-vis downloading. arXiv is by comparison beautifully simple and reliable. It has a nice consistency about it.

9
Tipzntrix 3 days ago 1 reply      
The hacker way extends beyond computing.
10
frozenport 3 days ago 0 replies      
We used to joke that if you were in the Tevatron (Fermi lab) parking lot you were a co-author!
11
rbanffy 3 days ago 0 replies      
They should drop the "Consortium" from the name. It's cleaner.
25
Google Spanner's Most Surprising Revelation: NoSQL is Out and NewSQL is In highscalability.com
239 points by aespinoza  4 days ago   79 comments top 9
1
nostrademons 4 days ago 4 replies      
It's really funny to watch tech journalists try to write about Google infrastructure from the outside, based only on one paper...

Hell, it's usually really funny just to watch tech journalists try to write.

2
cgs1019 4 days ago 2 replies      
This article comes across as really cynical and entirely lacking in the kind of rigor and detail I have previously found on highscalability. Spanner is really mind-blowingly cool tech. I thought this article was much more informative and worth the time to read: http://news.ycombinator.com/item?id=4562546
3
sigil 4 days ago 8 replies      
Buzzword headline aside, the Spanner paper is great and worth your time. As is the BigTable paper, the Dremel paper, and the Paxos Made Live paper.

I read the Google whitepapers and wonder, is there anywhere else one can go to work on real solutions to distributed systems problems? At smaller scales you can cheat -- you don't need Paxos, you can get away with non-consensus-based master / slave failover. You can play the odds with failure modes. At Google's scale, you can't: behavior under normally unlikely failures matters, probability matters, CAP matters.

4
nolok 4 days ago 5 replies      
NewSQL ? Seriously ? Do we really need another low-quality buzzword for people to re-use everywhere ?
5
Dave_Rosenthal 4 days ago 2 replies      
Disclaimer: I'm a co-founder of a database company (FoundationDB) building a scalable, ACID database.

I couldn't agree more with main quote that they pulled from the paper, expressing the difficulty of (even great) programmers having to "code around the lack of transactions." Ease of development is one of the biggest benefits of transactions.

However, another huge benefit that didn't get much play in the article is the freedom that transactions afford you to build abstractions and other data models on top of whatever you are given. In our product's case, a low-level ordered K/V store is used for a storage layer and several different data models are exposed on top (see http://foundationdb.com/#layers).

I think the future of databases has a diversity of data models and query languages (including SQL, document, K/V, columnar, etc.). I also think the future of databases is ACID. It seems like more and more of the NoSQL early adopters (and creators) are coming to the same conclusion.
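
The "layers" idea — several data models exposed on top of one ordered K/V store — can be sketched in a few lines. This is a toy illustration of the concept, not FoundationDB's actual API: the class names and key encoding here are made up, and a real system would wrap every operation in a transaction.

```python
import bisect

class OrderedKV:
    """Minimal ordered key/value store (stand-in for a real storage layer)."""
    def __init__(self):
        self._keys, self._vals = [], []

    def set(self, key, val):
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            self._vals[i] = val
        else:
            self._keys.insert(i, key)
            self._vals.insert(i, val)

    def range(self, prefix):
        """Yield all (key, value) pairs whose key starts with the tuple prefix."""
        i = bisect.bisect_left(self._keys, prefix)
        while i < len(self._keys) and self._keys[i][:len(prefix)] == prefix:
            yield self._keys[i], self._vals[i]
            i += 1

# A "document layer": store each field under a (collection, id, field) key,
# so one document's fields sort together and can be read with a range scan.
def put_doc(kv, coll, doc_id, doc):
    for field, val in doc.items():
        kv.set((coll, doc_id, field), val)

def get_doc(kv, coll, doc_id):
    return {k[2]: v for k, v in kv.range((coll, doc_id))}
```

Because the underlying store is ordered, the document layer needs nothing more than prefix range reads — which is exactly why transactions over an ordered K/V store are such a flexible foundation.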

6
aklein 4 days ago 0 replies      
Here is a google engineer giving a keynote on Spanner:

http://vimeo.com/43759726

7
kyt 4 days ago 0 replies      
"Maybe this time Open Source efforts should focus elsewhere, innovating rather than following Google?"

There's a ton of innovative projects in the open source community, but it's difficult to convince people to use them. Developing a clone of a Google tech has a built-in marketing advantage: "Google uses something like this."

8
cageface 3 days ago 0 replies      
So the correct and responsible thing for Google to do now would be to patent the shit out of this and then sue back into the stone age anybody that implements anything even vaguely similar, right?

I mean, the social fabric depends on companies protecting their innovations, right?

9
realrocker 4 days ago  replies      
Imagine an amateur programmer walking into this whole debacle.
26
SHA-3 to Be Announced schneier.com
238 points by stalled  4 days ago   53 comments top 7
1
exDM69 4 days ago 2 replies      
I implemented the Skein hash in a crypto class at my uni. What is remarkable about that hash is that it has a "hash tree" mode which provides an interesting opportunity for parallelization and doing hashing of partial data. In contrast, many traditional hash algorithms are inherently sequential by nature.

On the other hand, as Mr. Schneier points out in the article, the Skein hash utilizes a modified Threefish block cipher, while many of the SHA-3 contestants were AES-based (edit: seems like none of the finalists are). Now we have a hardware AES implementation shipping in mainstream processors, so it gives an edge to the AES-based hash functions out there.

edit: I went through the list of finalists and it seems none of them actually use the whole AES block cipher, although several of them use AES S-boxes or other parts of AES.
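
The parallelization opportunity of a tree mode can be sketched with a Merkle-style construction. This is only an illustration of the idea — it is not Skein's actual tree mode, and the chunk size and node encoding here are made up:

```python
import hashlib

CHUNK = 4  # toy chunk size; real tree modes use much larger leaves

def tree_hash(data: bytes) -> bytes:
    """Merkle-style tree hash sketch: each leaf chunk can be hashed
    independently (hence in parallel), then digests are combined
    pairwise up the tree until one root digest remains."""
    leaves = [hashlib.sha256(data[i:i + CHUNK]).digest()
              for i in range(0, len(data), CHUNK)]
    if not leaves:
        leaves = [hashlib.sha256(b"").digest()]
    while len(leaves) > 1:
        paired = []
        for i in range(0, len(leaves), 2):
            node = leaves[i] + (leaves[i + 1] if i + 1 < len(leaves) else b"")
            paired.append(hashlib.sha256(node).digest())
        leaves = paired
    return leaves[0]
```

A sequential hash must consume the input in order; here, re-hashing one modified chunk only requires recomputing the path from that leaf to the root.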

2
helper 4 days ago 2 replies      
I'm a bit surprised that Schneier is advocating for "no award". Even if the SHA-3 candidates are not fundamentally better than SHA-512, we really do need a standardized algorithm that has built in protection from length extension attacks.
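
For context, the length-extension weakness means that for Merkle-Damgård hashes like SHA-256, a naive `H(key || message)` tag lets an attacker append data without knowing the key. The standard mitigation today is HMAC — a minimal sketch (the key and message are invented for illustration):

```python
import hashlib
import hmac

key = b"secret-key"
msg = b"amount=100&to=alice"

# Naive MAC: vulnerable to length extension for Merkle-Damgard hashes --
# an attacker who sees this tag can compute a valid tag for an extended
# message without ever learning the key.
naive_tag = hashlib.sha256(key + msg).hexdigest()

# HMAC hashes twice with derived keys, which defeats length extension.
safe_tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

# Verify with a constant-time comparison to avoid timing side channels.
assert hmac.compare_digest(safe_tag, hmac.new(key, msg, hashlib.sha256).hexdigest())
```

A hash standard with built-in length-extension resistance would make the naive construction safe by default, which is the point being made here.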
3
swordswinger12 4 days ago 2 replies      
I think NIST should have a big Apple-esque unveiling event for new crypto. I for one am that excited about SHA-3.
4
Zenst 4 days ago 3 replies      
Interesting that the reason for SHA-3 has been missed, in that the finalists offer no better way to hash -- the main difference being that some are faster and some slower than the best SHA-2 variations.

What does this mean? Well, in effect no extra value is being directly offered. Sure, some have extra abilities by design, like being better able to exploit parallel processing by splitting the data to be hashed into chunks, working on partial blocks, and using the results to get the final hash. That is nice.

But when it comes to brute forcing, being faster works against you. Also, the ability to work on partial chunks of the data lets you modify the code and recheck the partial hash for just the part you're changing until you get the same result. This allows you to do nasty things to code and still get the official hash answer a lot more easily than having to rehash the entire result after every change (usually you have an area you jump over, all NOPs, and modify that to influence the hash -- there are saner ways to do this, but that's off topic).

So in essence, any hash that can be run faster in any way is weaker in terms of brute forcing (yes, I know people assume their password will be the last one on the list to be checked via brute forcing, and that if it takes 10 years to test all variations then their password is 10 years strong -- you can see the flaw in that mentality).

Now NIST still have an opportunity here, and it is a simple, tried and tested approach: declare all the finalists winners and include them all in the standard as variations. This would allow end users/admins to pick their variation of choice, or even -- perish the thought -- allow mixed usage, so that, say, your /etc/passwd file could have some users on one variation and others on another. Whilst it adds no obvious extra benefit, it allows more variations and thus fallback and choice, and that is what n-bit encryption/hashing is all about, each bit being a choice in a way.

So in summary I believe NIST should let them all win and have SHA-3.n, with n being the variation of finalist. Let them all win; choice is good, and that is what n-bit encryption is after all, extra choices.

5
dochtman 4 days ago 0 replies      
djb thought in March it was going to be Keccak:

https://twitter.com/hashbreaker/status/183552364953878528

6
JeremyBanks 4 days ago 0 replies      
Schneier picked a very misleading headline here. I was wary when I saw that the NIST page he links regarding the timeline still hasn't been updated since June, and then I saw him reply in the comments:

"> When will SHA3 be announced? Were you given special information the rest of us don't have access to?

I have no inside information on when SHA-3 will be announced. My guess is that they've made the decision, and are going over the final rationale again and again.
My guess is that it won't be Skein."

Even though this is the original title, I'd prefer the HN title be edited to something about Schneier hoping NIST will pick no SHA-3.

7
Zenst 4 days ago 0 replies      
Out of interest, these hash functions can be implemented in very few bytes, with 100 being mooted for the Skein hash. With that in mind, when it comes to brute forcing, I do wonder if it would be possible to just brute force a better solution more easily than brute forcing a hash. I say that in jest.

But it does make you realise how much impressive stuff you can do in just a few bytes, and what else is out there.

27
NoPassword alexsmolen.com
235 points by BerislavLopac  3 days ago   145 comments top 7
1
mapgrep 3 days ago 11 replies      
This turns your email inbox into a giant password manager, except without the extensive storage encryption used by real password managers. Thought experiment: What are the possible unintended consequences of that?

Just a few off the top of my head:

1. Easier for email hackers to detect sites where you have logins. (Controlling someone's email usually means controlling most/all logins, but it takes some digging to get a good list of valuable logins. With this solution, most of the list is on the first page or two of the inbox.)

2. Harder to detect being hacked. (Previously, a hacker with email access would have to reset your passwords, and you will notice that at least some passwords have changed. Now the hacker just has to delete any incoming authentication emails after reading them.)

3. Losing a job becomes potentially more catastrophic. (Hope you didn't associate too many passwords with your work email, because IT wiped your account while security was escorting you out the door. And before you say only dumb people use work email for personal accounts, consider that part of the idea of this nopassword system is to help "dumb people" who fail to (for example) use password managers.)

4. Your email provider now has a nice easily mined record of what sites you log into most often. But, hey, I'm sure we can all trust Google not to use that information in a terribly creepy manner, right?

2
eCa 3 days ago 5 replies      
I hope that Mozilla Persona[1] (or similar) will solve this problem soon.

[1] https://login.persona.org/

3
dkokelley 3 days ago 0 replies      
I think it is interesting that a good portion of HN users are quick to point out the security issues in a scheme like this, considering the fact that this is basically the current security model with the intermediate steps removed. The only thing that makes it different from logging in to my bank is that if I try a bank password reset I might get asked what my mother's maiden name is.
4
moeffju 3 days ago 1 reply      
We did this on Toptranslation.com as an alternative to the normal password-based login. The thinking is, since password resets go to your e-mail, anyone who has control over your mail or mail server can get into your account anyway. It's a tradeoff between user (mostly enterprise customers') convenience and security. Since all orders made through the system require another confirmation, we decided it was worth not having to handle the "I forgot my password" support tickets. Haven't had any problems with it, I think it makes sense for low- to medium-security authentications.
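
The mechanics of such an email-link login can be sketched with a signed, expiring token. This is a generic illustration, not Toptranslation's implementation; the secret, TTL, and token format are all invented:

```python
import hashlib
import hmac
import time

SECRET = b"server-side-secret"  # hypothetical; keep out of source control

def make_login_token(email, ttl=900):
    """Build a signed, expiring token to embed in an emailed login link."""
    expires = str(int(time.time()) + ttl)
    payload = f"{email}|{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def verify_login_token(token):
    """Return the email if the token is valid and unexpired, else None."""
    try:
        email, expires, sig = token.rsplit("|", 2)
    except ValueError:
        return None
    expected = hmac.new(SECRET, f"{email}|{expires}".encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    if int(expires) < time.time():
        return None
    return email
```

Anyone who can read the user's mailbox can log in — which is precisely the tradeoff described above, since password resets already grant the same power.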
5
kennu 3 days ago 3 replies      
Makes me wonder, if emails and SMS messages start to carry more and more plaintext passwords (and links) providing direct access to everything, how will their security hold up in the long run? I don't think they are inherently very secure channels.
6
drivebyacct2 3 days ago 1 reply      
This technique hits the front page once every other week. With the same flaws. Use BrowserID. It's better than this and it paves the future towards actually being able to truly transition away from passwords.
7
endianswap 3 days ago  replies      
I find it amusing that a page discussing passwords/security would throw a couple of SSL certificate warnings when loading it.
28
Why American Phone, Cable and Internet Bills are so High yahoo.com
235 points by bretpiatt  4 days ago   252 comments top 2
1
DanielBMarkham 4 days ago  replies      
This is one of the things I like ranting about, so I'll try not to do that.

I also live in an area that will never see high-speed internet. We were sold out by both the feds and our state legislators. You can argue that perhaps these government officials were misled or uninformed, but I remain dubious.

We were able to construct an interstate highway system. We were able to wire this country with electricity. We were able to wire it for telephones. The only reason we can't wire it with fiber is because of poor government management of eminent domain. We should kick most of those responsible out of office. To see a bunch of yahoos on TV telling me we need the road paved when most of their constituents don't have enough bandwidth to get high-quality instruction over the web? Or to work over the web? It's the Information Age, bozos. Something is wrong somewhere.

The worst part? I do not expect this to get fixed any time in the next couple of decades. Not only is it broken and hurting, but there's no political incentive to fix it. In fact the incentives run the other way.

2
seiji 4 days ago  replies      
Report from the field: On Friday I walked into a cell phone shop and told them I wanted a SIM mainly for data. They suggested a prepaid "all you can eat data" SIM good for one month (realistically it alerts at 15 GB usage). €20 ($26 USD). The €20 goes towards pay-per-minute and pay-per-text usage, but who calls or uses non-iMessage these days?

Back home, my Verizon bill will be $120 every month with one iPhone 5 and one LTE iPad sharing the same data plan (breakdown: $40 for iPhone 5, $10 for iPad, $70 for unlimited talk/text + 4GB shared data (or, replace $70 with $100 for 10 shared GB data for a grand customer abuse total of $150 every month)). You can't seem to even get non-unlimited voice/text anymore. Let's not bring up the $35 fee per device they charge to "activate" it either -- I don't want to headasplode in this room (it has white walls).

Data only, please. For under $3/GB, please. RFN, please, not in another 10 years.

29
Why I think Rust is the "language of the future" for systems programming winningraceconditions.blogspot.com
231 points by pcwalton  2 days ago   177 comments top 3
1
haberman 2 days ago 5 replies      
I'm a die-hard C guy. My motto for years has been "you can pry pointers and address spaces from my cold, dead hands."

Of the new languages I've seen lately, Rust is my favorite. I love how it gives me better ways to express things I actually want to say without imposing GC on me.

But even so, I can't see myself actually using it for much, because writing in a language other than C means buying in to that language's runtime. Buying into one language's runtime means that your code won't play nice with other languages' runtimes.

If I write a library in Rust, how can I expose my types and algorithms to Ruby, Python, Lua, etc? How will Rust Tasks play with Python threads? What if I use a Rust Pipe to send a Python value between tasks? How do I keep Rust from doing a GC pass while I'm holding the Python GIL? etc. etc.

Programming Languages by their nature want to be at the center of your world. If you buy into their abstractions, everything works nicely. But if you try to mash two of them together in a single process, you start to suffer from the fact that their abstractions overlap and don't interoperate at all.

If you're only writing an application (ie. not a library) and never want to embed other languages into your application, then this might be ok. But I'm more interested in writing shared functionality that is useful across languages. Why should the whole stack of parsers, crypto, compression, etc. have to be written separately in each language? Life is too short to do some great work that is only usable by one language community -- computing is so big and changes so much that one-language-only functionality is at best limiting your market and at worst dooming your code to obsolescence when the next big language comes around.

So as much as I instinctively like Rust, I think I'll be sticking with C.

2
kibwen 2 days ago  replies      
Out of mild concern over the title (not that it ought to be changed, TFA doesn't really have a meaningful title), I'd like to preemptively defuse any potential flame war.

Go and Rust are not really competing. There may be some overlap in domain, but they occupy different niches and will likely appeal to different crowds. Go will appeal more to people who prefer its focus on conceptual simplicity and its "opinionated" nature (very much like Python). Rust will appeal more to people who prefer "functional" trimmings (algebraic datatypes, pattern matching, liberal use of higher-order functions, Scheme-style macros) as well as folks fed up with C++ who aren't willing to give up complete control over memory and performance characteristics. The fact that both languages are built around a similar concurrency model is just convergent evolution.

It's tempting to cast Go vs. Rust as Google vs. Mozilla and/or Chrome vs. Firefox, but there's no practical reason that both languages cannot peacefully coexist and thrive.

3
megaman821 2 days ago  replies      
Rust is what I had hoped Go would be.

Google employs some of the brightest computer science minds in the world and turns out stuff like Go and Dart, which seem to be more aimed at enterprise Java programmers rather than computer scientists or programming enthusiasts.

30
Why I'm not leaving Python for Go uberpython.wordpress.com
216 points by ubershmekel  5 days ago   239 comments top 2
1
cletus 4 days ago  replies      
I have mixed feelings about errors as return codes. Then again, I have mixed feelings about exceptions.

There are two general use cases for exceptions:

1. Unexpected (typically fatal) problems;

2. As an alternative to multiple return values.

(1) is things like out of memory errors. (2) is things like you're trying to parse a user input into a number and it fails. I despise (2) for exceptions. It means writing code like:

    try:
        f = float(someText)
    except ValueError:
        # I just parsed you, this is crazy,
        # here's an exception, throw it maybe?
        raise

where this gets particularly irritating is when you start writing code like this:

    try:
        doSomething()
    except ValueError:
        pass

I nearly always end up writing wrapper functions around that crap.

Java is worse for this because some libraries (including standard ones) abuse checked exceptions for this. I actually prefer:

    if f, err := strconv.ParseFloat(text, 64); err != nil {
        // do something (f is the zero value here)
    }

or even:

    f, _ := strconv.ParseFloat(text, 64)

for this kind of scenario.

For the truly bad--typically fatal--error conditions and cleanup, IMHO defer/panic actually works quite well. I certainly prefer this:

    f, _ := os.Open("foo")
    defer f.Close()
    // do stuff

to:

    f = None
    try:
        f = open('foo')
        # do stuff
    finally:
        if f:
            f.close()

as Go puts the two relevant things together.

Don't get me wrong: I like Python too but I do think Go has a lot going for it and has a bright future.

2
tikhonj 5 days ago  replies      
As usual in these threads about Go, I really wish people would consider Haskell as a nice alternative. A lot of people write Haskell off as "academic" or "impractical", which I feel is not an entirely fair assessment.

Particularly: Haskell is fast, concurrent by design (whatever that means, I'm sure Haskell is :P), typed but not cumbersome or ugly (less cumbersome and ugly than Go's types, even) and--most importantly for this article--does error handling really well.

In a certain sense, Haskell's main method for handling errors is actually similar to Go's. It returns a type corresponding to either an error or a value (this type, naturally, is called Either). This method is special, oddly enough, because it isn't special: Either is just a normal Haskell type so the error checking isn't baked into the language.

Coming from another language, you would expect to have to check your return value each time. That is, you fear your code would look like this pseudocode:

    if isError val1
    then pass val1
    else val2 <- someFunction val1
         if isError val2
         ...

However, this is not the case! The people implementing Either noticed that this was a very common case: most of the time, you want to propagate an error value outwards and only do any work on a valid value. So you can actually just write code like this:

    do val1 <- someValue
       val2 <- someFunction val1
       val3 <- someFunction val2
       someFunction val3

then, if any of the values return an error, it gets propagated to the end of the code. Then, when you want to check if you actually got a valid value--maybe in some central location in your code, or wherever is convenient--you can just use a normal case analysis.

Additionally, while the errors do pass through essentially silently, the type checker does ensure you deal with them at some point before using or returning them. If you ever want to get the Int from a Either Error Int, you have to either handle the error case somehow or explicitly ignore it (e.g. with a partial pattern match). The latter option will generate a compiler warning, so you can't do it by accident without being notified.

So the mechanism is simple, but it can also stay out of your way syntactically. So what else is this good for? Well, it's just a normal data type, nothing special; you can use existing library functions with these values. For example, you can use the alternation operator <|> to find the first non-error value:

    val1 <|> val2 <|> someFunction 5 <|> ...

This is often a very useful idiom which would be harder to write with a different way of handling errors. There are more utility functions like this (e.g. optional) that let you make your intents very clear at a high level. The optional function, for example, lets you do exactly what it claims: you can mark a value as "optional", meaning that any error from it will be ignored rather than propagated.

You can also layer on this error-handling logic on other similar effects. For example, there are some types (like LogicT) that represent backtracking search. Combining error-handling with nondeterminism gives you a question: should an error cause the whole computation to fail, or only that particular branch? The beauty is that you can choose either option: if you wrap LogicT with ErrorT (this is just like Either except it can be combined with other types) an error will cause the whole computation to fail; if you wrap ErrorT with LogicT, the error will only cause the current branch to fail. This not only makes it easy to choose one or the other, but also makes it very clear which one you did choose: it's right there in the type system in a very declarative fashion.

Haskell also has a bunch of other advantages which aren't worth going into here. I think anybody looking for something like Go--a fast, high-level, concise, typed alternative to Python--should definitely consider Haskell. While it may not seem so at first, I believe that Go and Haskell are good for a very similar set of problems and so actually overlap quite a bit.

If you really dislike some particular things about Haskell, you should also consider some similar languages like OCaml and F#. I personally prefer Haskell, but there are definitely good cases to be made for either of the other two.
