hacker news with inline top comments    22 Feb 2012 Best
1
Every day at my job I helped people just barely survive metafilter.com
913 points by blatherard  3 days ago   314 comments top
1
tptacek 3 days ago  replies      
Can I add to this one little point, which is a counterintuitive thing you can do to help your local library:

Use your local library.

During the year we started Matasano our Chicago team spent about 40%-50% of our time working from the Oak Park Public Library, sometimes in meeting rooms we booked, more often at study desks. It was great. The Internet access wasn't amazing but it was totally functional and we could VPN out through it. The desks and working space were if anything better than what we have now (and we really like our office).

There were times I worried that we were being a burden, but the impression I get is that it's the opposite. What's deadly for a local library is for nobody from the community to be using it, for it to have no stakeholders from the tax base of the community. The library staff was always welcoming to us.

Your hip coffee shop on the other hand hates you with a passion it normally reserves for Scott Stapp solo albums. At the coffee shop, you take up space in a business that's driven by turnover. Someone's going to chime in here with a story about a coffee shop that truly loves the startups that park themselves at their tables and order 3 count them 3 cups of coffee in a day, but I've talked to hipster coffee shop people oh-yes-I-have and at least some of you who truly believe you're doing your coffee shop a favor are being tolerated gracefully, not welcomed, like you would be at your local library.

Libraries have an obvious role to grow into as IT hubs for their communities, now that so much of their knowledge-disseminating role has been subsumed by IT. But another related, less obvious role is as a hub for local entrepreneurship; the thing "hackerspaces" are supposed to do, but are (for so many companies) suboptimal at.

2
14 years ago: the day Teller gave me the secret to my career in magic. shwood.squarespace.com
786 points by Sukotto  6 days ago   63 comments top 11
1
kevinalexbrown 6 days ago 1 reply      
It took me a while to appreciate it, but what I enjoyed most about this was the letter from Brushwood to Teller. Teller's letter was awesome in all of the general ways, and he makes great points about doing something besides the thing you want to excel at.

But Brushwood's email was great because it asked for advice for a specific problem, and it was a problem that Teller had the expertise to answer. Not "Oh Teller! I want to be a great magician just like you!" Not "What are 10 things an aspiring magician should do?" Not "Dear Teller, would you mind sending me information you find relevant or letting me pick your brain over coffee?" It was "I want to develop my own style, here's what I've done to that end. I've had some success, but here's how I struggled with taking it to the next level. What do you suggest, as someone who's accomplished this?" Emails like that tend to get the best kinds of responses.

2
boredguy8 6 days ago 3 replies      
Is anyone else as impressed as I am that Teller remembered the name of an up-and-coming magician they met after a show more than a year earlier? And that he referenced the 'inside joke' with no prompting?

Also, I think folks are missing what seems to me to be the key takeaway: "And if I'm good, it's because I should be a film editor. Bach should have written opera or plays. But instead, he worked in eighteenth-century counterpoint." He's taking something that's well understood in one area and bringing it to an area where that thing is not understood. Sometimes people call that a 'distinctive'. Sometimes it's a 'paradigm shift'. Teller could have been an average film editor, or he could take ideas from film and bring them to magic.

3
Sukotto 6 days ago 3 replies      
I think there are some great points that adapt well to the startup mindset.

* [Ship stuff]. A lot. Try stuff. Make your best stab and keep stabbing. If it's there in your heart, it will eventually find its way out. Or you will give up and have a prudent, contented life doing something else.

* We did not start as friends, but as people who respected and admired each other. Crucial, absolutely crucial for a partnership.

* Have heroes outside of [startups / technology / business]. ... You're welcome to borrow [other people's], but you must learn to love them yourself for your own reasons. Then they'll push you in the right direction.

* Love something besides [technology / startups / business]. Get inspired by a particular poet, film-maker, sculptor, composer. You will never be [ $famous_startup_personality]. But if you want to be, say, the Salvador Dali of [startups / technology / business], THERE'S an opening.

4
alanfalcon 6 days ago 1 reply      
"Here's a compositional secret. It's so obvious and simple, you'll say to yourself, 'This man is bullshitting me.' I am not. This is one of the most fundamental things in all theatrical movie composition and yet magicians know nothing of it. Ready?

Surprise me.

That's it. Place 2 and 2 right in front of my nose, but make me think I'm seeing 5. Then reveal the truth, 4!, and surprise me."

Teller absolutely means this. Watching the Penn and Teller Magic and Mystery Tour, this was one of the best scenes in that show, or in anything else I watched on Netflix last year:

http://www.youtube.com/watch?v=TgtgOs_OkTU

Yes, Teller is speaking, but that isn't what matters at all. The takeaway for me is that you must KNOW YOUR AUDIENCE! This is a magic trick that would not be a magic trick at all if it weren't performed for someone like Teller, a master of his craft. This scene from an otherwise interesting but forgettable show blew my mind.

5
mattmaroon 6 days ago 1 reply      
Humorous story: I once got yelled at by Teller. I was in Vegas for the World Series of Poker which was held at the Rio, the same casino where their show runs. The previous night I had had dinner at Okada with Penn's wife, Emily, who was a friend of the friend who put the dinner together. I also had tickets the next day to see their show for the first time.

On a break from my tournament I stopped by Starbucks to properly caffeinate and saw Penn and Teller sitting in a corner. I went over to introduce myself, as I would to the spouse of anyone I had just met if I somehow recognized them. Right when I got there Teller gave me the evil eye and said something like "Can't you see we're working here?".

Of course they were just sitting at a Starbucks talking. They didn't have any props or anything. And probably they were hashing out details for the show or something, but I still found it quite rude, especially since I know they greet people outside of the show every night afterward.

But mostly I just thought it was funny to get yelled at by a guy most people think is mute.

6
bambax 5 days ago 0 replies      
> Here's a compositional secret. (..) Surprise me.

> Here's how surprise works. While holding my attention, you withhold basic plot information. Feed it to me little by little. Make me try and figure out what's going on. Tease me in one direction. Throw in a false ending. Then turn it around and flip me over.

This is the secret of storytelling. Hell, it's the secret of art. Music, painting, writing, architecture, wine making... you name it.

Storytelling > story, BTW. That's why P&T can make something new out of the same old tricks. If you tell the same old story in a new way, it feels unique.

The surprise is not in the content, it's in the experience.

7
user24 5 days ago 0 replies      
I learned this lesson accidentally.

I was showing a friend a new card trick I'd made up, fanned the cards and asked him to pick one, remember it and replace it into the deck. As he did that I got confused - I thought I'd gone wrong and, apologising, asked him to pick another card. Fanned the cards again and he looked down at them to pick another one. In the middle of the fan, staring right back at him one card was face up; the card he'd chosen the first time.

I hadn't gone wrong at all, the trick had gone perfectly - I was just confused and thought I'd gone wrong when I hadn't. It wasn't the plot line I'd originally had in mind for the trick, but it blew him away because when he went to pick a card the second time he wasn't expecting to see the card he picked the first time. The surprise was a huge contributing factor to the impact of the trick.

8
jasonkolb 6 days ago 2 replies      
I recall reading a study once where the brain waves of people listening to a story literally synchronized with the brain waves of the person telling the story. Quite literally the storyteller was controlling the minds of the listeners.

Apparently we are hard-wired to become entranced by stories, but there is a skill to telling them in a way that doesn't break the trance--or, as Teller talks about, leading people to a place where you can smash the trance in a dramatic way for maximum impact.

Storytelling has fascinated me ever since I read that study. I must go find it again.

9
randall 6 days ago 0 replies      
First: Brian is the man. (Had him on a show I used to be on called The 404, and he had me both laughing and vomiting by the end of the show.)

Second: Teller's advice is echoed by another great... Ira Glass. He talked about how to make it in broadcasting. http://shorty.randallcbennett.com/post/36484012/ira-glass-gi...

The thing with all of this is pretty simple. When you start, you have taste. If you have good taste, you'll know what you're making isn't quite what you want, but it's on the way to being good. If you keep perfecting your taste along the way, you're making positive progress. For startup entrepreneurs, that often means starting lots of insignificant companies / working at insignificant companies on your way to perfecting the sense of who you are.

I feel like I'm _just_ starting to know who I am professionally. It's been about 7 years, and now I know that the only thing I want to do is make it so anyone can create TV quality video without a TV quality budget.

Failing in numerous ways has finally led me to figure out who I am. Now, it's time to take what I've learned and actually contribute something to the world.

For Brian, it's kickass magic which people can consume online. For me, who knows what my end "product" will be, but either way, if you don't enjoy the taste-refining process, you'll probably not only be dissatisfied with your life but also miss out when you're about to break through.

I hope anyway.

10
moreorless 6 days ago 1 reply      
Brian Brushwood is an amazing magician, but as a person, he is even more amazing. I had the fortune of meeting him last weekend at a common friend's house and saw him take time out to entertain a kid. If you're not familiar with his work, he has a weekly podcast on Revision3.com (http://revision3.com/scamschool).
11
mgallivan 6 days ago  replies      
Why does transferring knowledge across domains have such exponential returns?

Is it only because the time it takes other fields to adapt to new methodology is longer - or are there other factors?

EDIT: Holy crap, I worded this terribly.

Rewritten: Why does going across domains with new ideas have such high returns? Obviously you're a fresh idea but are there other reasons that influence your success?

3
Xcode, GCC, and Homebrew kennethreitz.com
637 points by craigkerstiens  5 days ago   103 comments top 2
1
pavlov 5 days ago  replies      
Unfortunately Apple's new Command Line Tools for Xcode doesn't include the regular gcc, but rather Apple's gcc-compatible wrapper around LLVM (i.e. /usr/bin/gcc is a symlink that points to llvm-gcc).

llvm-gcc doesn't compile many projects correctly, including gcc itself (as of Xcode 4.2 there were reports that it's not possible to compile a gcc cross-compiler using llvm-gcc). Hopefully there have been improvements in this regard.

Still, anyone who needs the real gcc will have to install it through some other means.
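If you're unsure which camp your machine falls in, resolving the symlink tells you what the `gcc` on your PATH really is. A minimal sketch (the function name is mine, and the example path in the comment is illustrative):

```python
import os

def resolve_toolchain(path):
    """Follow any chain of symlinks and return the real file behind it."""
    return os.path.realpath(path)

# Illustrative: on an Xcode 4.2-era Mac you might see something like
#   resolve_toolchain("/usr/bin/gcc")  ->  ".../llvm-gcc-4.2"
# Running `gcc --version` and looking for "llvm" in the output works too.
```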

2
mapgrep 5 days ago  replies      
While this is a nice convenience for registered Apple developers, I noticed that this download requires me to sign a confidentiality agreement and an NDA as part of the registration process (since I am NOT a registered Apple developer).

The nice thing about just installing the OS X developer tools off the OS X DVD is that you do not need to enter into these sorts of legal agreements with Apple (beyond a standard software EULA).

4
Jotform domain seized by US due to user generated content jotform.net
600 points by Maxious  6 days ago   234 comments top 2
1
mixmax 6 days ago  replies      
If you read through the comments there's a lot of angry users demanding refunds and questioning the service. There's a fair chance that they won't be able to bounce back after this. Especially if the domain doesn't come back up within a day or two.

In other words, this might very well kill a company that someone worked hard to get off the ground. And if you have any user-generated content it might happen to your company too. Apparently without due process, and without warning.

This is preposterous.

2
ericabiz 6 days ago  replies      
Can we all PLEASE agree to stop using GoDaddy?

This is a GoDaddy thing, plain and simple. They get one complaint--they shut your domain name down by changing the name servers to NS1.SUSPENDED-FOR.SPAM-AND-ABUSE.COM and NS2.SUSPENDED-FOR.SPAM-AND-ABUSE.COM. Exactly what happened here.

This has been going on for at least SIX years now (see http://seclists.org/nmap-hackers/2007/0 and I saw a hosting company shut down for similar reasons a year before that).

Wasn't their support of SOPA enough? When are we all going to wake up? How many times does this have to happen?! STOP. USING. GODADDY.
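The suspension is at least easy to detect mechanically: the quoted nameservers follow a fixed pattern, so a watchdog for your own domains only needs a substring check. A sketch (the helper name is mine; you'd feed it the nameserver list from a whois or `dig NS` lookup):

```python
def looks_suspended(nameservers):
    """True if any nameserver matches the suspension pattern quoted above."""
    return any("suspended-for.spam-and-abuse.com" in ns.lower()
               for ns in nameservers)
```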

5
Online Python Tutor mit.edu
522 points by llambda  2 days ago   45 comments top 18
1
pgbovine 2 days ago 1 reply      
whoa what a pleasant surprise seeing my project on the HN front page this morning! just an FYI: this is a "demo-quality" project, not production-quality. it's probably not ready for HN-level traffic! please email me if you have more detailed suggestions or bug reports, thanks.
2
thefool 2 days ago 1 reply      
This is awesome.

Would be cool to see a version for a lower level language like C that would help people understand the stack and visualize pointer arithmetic.

3
ht_th 2 days ago 3 replies      
Although I like these kinds of tools, I doubt they're useful for learning basic programming, as they require a lot of prior knowledge just to understand the visualization. For example, what is a stack or heap to a novice programmer? What will the effect of this visualization be on their construction of an operational understanding of basic programming language concepts? And these constructs, will they be helpful, have no effect, or even be detrimental for developing understanding of more complex programming structures?

On the other hand, in a more advanced course, say at university level, it might help to construct knowledge about how programming languages work internally and how this relates to their already existing understanding of basic programming constructs. So, as a teacher be sure to evaluate the effect of the educational tools you use. I know it is easy to use existing cool tools (and this one definitely belongs to that category), but what is its effect on your students' learning?

4
wilfra 2 days ago 1 reply      
If you want to learn Python, MIT OCW Intro to Programming is probably the best resource there is: http://ocw.mit.edu/courses/electrical-engineering-and-comput...
5
norvig 2 days ago 1 reply      
It is great that Hacker News has caught up with this -- Philip has done a great job with the Python Tutor. Next step: user-definable layout for display of different data types.
6
ilovecomputers 2 days ago 0 replies      
As someone who went through Intro to CS courses, I must say that including the object-oriented paradigm of Java in an intro class really confused me for a good three years. It wasn't until I learned how data was organized and executed inside a computer that it all became clear to me what programming really consisted of.

This tool is wonderful. It visualizes the inner workings of a computer very well and how it translates to code. I hope this teaching tool is used on students after they've learned some basic programming concepts and syntax.

7
mckoss 2 days ago 1 reply      
Works great on iPad -- well done!
8
forbes 1 day ago 0 replies      
This is a fantastic tool. I love it.

On line #9 of the example program, the string "hello" is assigned to y. On line #12 another string "hello" is used. I don't know anything about the inner workings of Python, but I imagine that these strings would be stored on the heap and possibly 'interned' by the interpreter. Maybe this is being ignored for simplicity, or I am way off the mark.
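You're not far off the mark: CPython does intern identifier-like string literals at compile time, while strings built at runtime are equal but usually distinct objects unless you intern them yourself. A quick, CPython-specific poke at this:

```python
import sys

a = "hello"                 # compile-time literal: interned by CPython
b = "".join(["hel", "lo"])  # built at runtime: equal value, fresh object

print(a == b)                          # True -- same value
print(sys.intern(a) is sys.intern(b))  # True -- interning unifies them
# `a is b` is typically False here, since join() results aren't interned,
# but that's an implementation detail rather than a language guarantee.
```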

9
bp_ 2 days ago 2 replies      
The tool is awesome, but I wouldn't rely too much on the instruction bounds. For example, the greatest sums exercise (http://people.csail.mit.edu/pgbovine/python/question.html?op...) can be solved in six "steps" for all input lengths, while it's certainly no O(1) business.

  def maxPairSum(data):
      return sum(sorted(data)[-2:])  # one "step"

10
gahahaha 2 days ago 0 replies      
Does anything like this exist for Javascript?
11
agumonkey 2 days ago 0 replies      
100% cpu usage on chrome 19.0.1042.0 canary
0% cpu usage on firefox 13.0a1 (2012-02-18) until foo/bar execution (100% cpu usage too)

Like it anyway. Great job

12
morenoh149 2 days ago 1 reply      
I need 4 lines to do the mergesort correction. What's the way that is expected here? I would do,

    if i == len(left):
        result.append(right[j])
    if j == len(right):
        result.append(left[i])
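For comparison, one common way to write the whole merge step handles the leftovers with slices instead of per-element checks. A sketch (not necessarily what the exercise grader expects):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # one of the two lists is exhausted; copy whichever has leftovers
    result.extend(left[i:])
    result.extend(right[j:])
    return result
```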

13
paufernandez 2 days ago 0 replies      
Thanks a lot! I wish I had something like this for basic C++, my students would love it!
14
vng 2 days ago 0 replies      
This should be insanely useful for someone new to Computer Science. I wish I had this during my undergrad career. Good job!!
15
bwarp 2 days ago 0 replies      
I'd love to see this in debugger form!
16
ya3r 2 days ago 1 reply      
Where was the aliasing?
17
indubitably 2 days ago 1 reply      
unicode breaks it
18
beggi 2 days ago 0 replies      
Really, really cool.
6
Did You Hear We Got Osama? roshfu.com
519 points by choxi  4 days ago   204 comments top 2
1
wheels 4 days ago 11 replies      
This is a topic that I've thought some on in that my company is in the recommendations space, and some thoughts on what make news recommendations difficult:

The function of news is to facilitate smalltalk.

I am reasonably convinced of this. News, as such, is mostly just something that you're supposed to have read so that you can get by in usual social interactions. What we read (and are supposed to have read) is very tied to our place in the social hierarchy. Most folks don't actually want news to be too personalized because then it loses its social function.

I stopped reading the news at one point -- for a couple years -- because of its persistent lack of depth. I realized that reading 100 BBC articles on the Israeli-Palestinian conflict would leave me knowing virtually nothing about it, other than some factoids, whereas the same time spent reading books would be worthwhile.

There's an interesting problem to be solved here -- one that's been on my mental back-burner for a while. I'm not sure if Pandora's box can be resealed and we can work our way back to mediums with more depth and less distraction, but I both hope we can, and have much interest in the mechanism for such.

2
peterwwillis 4 days ago  replies      
It is really, really, really, really, really hard for me to find a good temporary distraction nowadays. The noise is so monotonous, so repetitive, so completely devoid of intellectual stimulation that I go between four websites in a loop looking for something interesting to read.

HN has maybe one article every two hours that pops up to the top that I find worth reading, and maybe half the time worth upvoting. And that's the only good source of noise I have. Everything else is shit.

I don't care about politics. I don't care about the tech scene, or gadgets, or games, or celebrities, or sports, or this quarter's fiscal projections for a multinational corporation. You name a "news" story and I probably would hate to read about it. Even if I want to read it, it has almost no background information or anything more than the re-cutting of a press release with a paragraph describing why the press release was released. Regurgitated stock information with nothing of value.

Here's some choice excerpts from Google News, which I guess is supposed to be some representation of what's happening in journalism today:

  * Microsoft unveils new, more window-like logo for Windows 8
* Robin Thicke Arrested for Pot Possession
* The mostly good and sometimes bad Top 10 moments of Tim Wakefield's Red Sox career
* [John] Glenn worries the US is ceding its space leadership
* Ohio AG DeWine switches from Romney to Santorum
* Identity Theft Tops IRS's 2012 "Dirty Dozen" Tax Scams
* GOP candidates fighting over Michigan
* Anthony Shadid, New York Times foreign correspondent, dies at 43
* FDA Still Wary of Diet Pill's Side Effects

I don't want noise, but sometimes I need noise. And when I want it, I want it to be worthwhile. It seems like nothing on the internet ever is.

7
Ubuntu for Android ubuntu.com
504 points by dave1010uk  11 hours ago   144 comments top
1
bguthrie 9 hours ago  replies      
This, or something like it, is the future: the computing device is portable, and adapts itself to the forms of input available. There's no reason why your display should have to be permanently attached to the device that drives it, and increasingly, it won't be.

I don't know what the implications are for Ubuntu or Android. But genuine support for a first-class computing experience is one of the few things that would tempt me back onto those platforms.

8
Hacking Hacker News joelgrus.com
495 points by hexis  4 days ago   82 comments top 14
1
jcr 4 days ago 1 reply      
With all due respect Joel, it seems you missed a good number of the inputs you could have used to train your classifier.

If I (jcr) want to know what classifies as "interesting to Joel" (joelthelion), I simply look at the "comments" and "submissions" links in your HN profile. They will show me the stuff you took the time to comment on, or took the time to submit to HN.

If I want to know what's "interesting" to me, the "saved stories" link is visible in my own profile even though it is not visible to others. In the "saved stories" is THE goldmine of every submission I've either submitted or up-voted.

https://news.ycombinator.com/saved?id=jcr

Depending on your personal bookmarking habits, your bookmarks file/db can be another useful input. I'm in the habit of bookmarking both the submitted article and the HN discussion page (if it's good). Assuming I didn't bookmark the HN discussion, I could easily find the HN discussion/submission for all of the sites I've ever bookmarked with a search engine. Give Google a URL and ask it for all of the sites linking to that URL, then parse for HN, and you've got the target.

The serious problem I see with your approach was already mentioned by ck2: you're creating a bubble and will miss out on all the fantastic stuff that is interesting to you, but that you don't yet know is interesting to you.

One of the primary benefits of HN and similar sites is learning about the things that others find interesting. Those things may not interest me, but the fact that others find them interesting is, well, interesting.

Why do they consider it interesting?

Why do I consider it uninteresting?

Even if my personal opinions remain unchanged, these are important questions for me to keep asking myself, repeatedly.

2
raganwald 4 days ago 5 replies      
Old timers (‘scuse me while I slam a geritol and bourbon) will remember that when reddit first launched, there was a recommendation engine that purportedly took your votes and turned them into a personal page of stories you would enjoy.

It was scrapped and eventually subreddits were introduced. I think people like the idea of communities.

3
thristian 4 days ago 4 replies      
There's a lot of stuff on HN I like, and a lot of stuff I find boring or irrelevant. Unfortunately, the stuff that I really like tends to be novel and unpredictable, so trying to teach some kind of Bayesian classifier to recognise things I'll like is probably not going to work.

Personally, I think I'd be perfectly happy with an old-school killfile: do not show me posts whose headlines contain the strings "X", "Y" or "Z", or that link to sites "A", "B" or "C".
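That killfile is only a few lines; a minimal sketch (the banned strings, sites, and field layout are made up for illustration):

```python
def killfile(stories, bad_words, bad_sites):
    """Drop any (title, url) pair whose title contains a banned string
    (case-insensitive) or whose URL points at a banned site."""
    return [(title, url) for title, url in stories
            if not any(w.lower() in title.lower() for w in bad_words)
            and not any(site in url for site in bad_sites)]
```

Feeding it the front page (or the RSS feed) before rendering gives you exactly the old-school behavior described above.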

4
ck2 4 days ago 4 replies      
By filtering out stuff, you'll never expose yourself to things outside your "pattern".

HN's subtitle is "Links for the intellectually curious"

I guess HN has a dual purpose to keep people up on "breaking" hacker news, but I like to think of it as "hacker news outside your thinking pattern".

Also why do people immediately go to AWS for testing something? Doesn't a real hacker have their own server handy for experimental projects, or is it only me?

5
m0th87 4 days ago 1 reply      
I applied naive bayes to generic news for a project a few years ago. Counter to some of the comments here, I think it works surprisingly well in filtering articles, and is a great way to start.

One of the nicest aspects of it is that it doesn't support a user's confirmation bias: your perspective isn't taken into account in filtering since it's just looking at keywords. That's probably not as important here, but especially on political news it's highly relevant. If I'm a Democrat, I don't want just left-leaning news to come through the grapevine, because that prevents me from seeing the other perspective.
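For anyone curious how little code that takes, the keyword-counting core of such a naive Bayes filter fits in a couple of functions. A toy sketch with add-one smoothing (the labels, training titles, and smoothing constant are all made up):

```python
import math
from collections import Counter

def train(examples):
    """examples: list of (title, label) pairs, label in {"like", "skip"}.
    Returns per-label word counts and overall label counts."""
    words = {"like": Counter(), "skip": Counter()}
    labels = Counter()
    for title, label in examples:
        labels[label] += 1
        words[label].update(title.lower().split())
    return words, labels

def classify(title, words, labels, alpha=1.0):
    """Return the label with the higher log posterior; add-one smoothing
    keeps unseen words from zeroing out a label's score."""
    vocab = set(words["like"]) | set(words["skip"])
    best, best_lp = None, -math.inf
    for label in labels:
        lp = math.log(labels[label] / sum(labels.values()))
        total = sum(words[label].values()) + alpha * len(vocab)
        for w in title.lower().split():
            lp += math.log((words[label][w] + alpha) / total)
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

Training it on titles you upvoted versus titles you skipped is enough to get the keyword-only behavior described above.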

6
iamgilesbowkett 4 days ago 1 reply      
This is a very rewarding thing to do. I recommend it to everyone. You don't necessarily even need to use AI.

I filter HN with http://hacker-newspaper.gilesb.com/, which pulls RSS, filters it, and reformats it on an hourly cron job. I mainly did it for the typography -- I disagree with just about every visual design decision on Hacker News -- but added very primitive filtering after the fact. I throw out any story from TechCrunch, Zed Shaw, Steve Yegge, and Jeff Atwood, because I just got tired of them, and any story with "YC" in it, too, because I got tired of seeing job ads for Y Combinator startups. (In fact it was the job ad for a Curebits marketing manager right after their scandal that did it.)

When Apple launched the iPad, I went in and added a simple regex to filter out any story about it. Hacker News is a great source for skimming but occasionally gets fixated on topics. I get like a hundred uniques a day so it's not exactly a huge hit, but I've thought about making a commercial version with customization. It got featured on Mashable and somebody created an iPad app which looked very, VERY similar, which I'm going to take as validating my design. But whether or not I ever startupify it, anyone who wants a customized version can just fork the project, deploy their own version in like ten or twenty minutes, and tweak regexes to their heart's content. It's on GitHub (https://github.com/gilesbowkett/hacker_newspaper) and only requires the most basic proficiency with cron, ruby, and python.

I also want to add comment-scanning. Right now I don't use comment links at all. The code extracts them but then simply throws them away. I don't want to add comment links back in unless I can also set it up to alert me if the comments thread contains comments from raganwald, patio11, jashkenas, amyhoy, etc -- basically automated comment elitism. I'm not trying to be a dick with that, I'm just a busy dude.

Anyway, when I set out to do this, I planned on doing a bunch of Bayesian whatnot, but I found that I got most of the way there just tweaking regexes occasionally. Likewise there's a lot of rough edges I could clean up, e.g., text encoding is a bit of a mess, and summarization in the style of http://tldr.it/ would make it way more useful.

But I recommend it because making deliberate decisions about what info you want to get from HN makes it a lot less like watching TV and a lot more like doing actual research into topics which interest you. It's surprising how much more enjoyable HN becomes when viewed through a customized filter.

7
joelthelion 4 days ago 1 reply      
I've been working on this kind of stuff on an off for a while now. I still think it's a great idea, but recommendations are a tricky business and naive bayes is not good enough.
8
Sukotto 4 days ago 0 replies      
Nice article and an interesting approach. thanks for writing it up.

  The model can only get better with more training data,
which requires me to judge whether I like stories or not.
I do this occasionally [using] the above command-line tool,
but maybe I'll come up with something better in the future.

Well, you could analyse your server logs to see which stories you really did click on and which you skipped over (You'll probably want to only consider pages that have at least one click for the edge case of "I didn't even look at that page". Need a cookie or login to make sure it only counts your clicks)

Also consider scraping your HN "saved stories" list as a positive source.

Don't recall if you mentioned it in your article, but you'll probably want to randomly insert the occasional low-scoring article as a check for under-weighting.

I really wish our profile pages supplied a (private) log of up/down votes for comments and flags for stories in addition to the up votes for stories. It would make for some interesting datamining.

9
joelgrus 4 days ago 0 replies      
Hey everyone, thanks for the comments. There's too much to respond to everyone, but a lot of people have brought up an "echo chamber" concern.

As other people have pointed out, the naive Bayes model works topically, so it will learn that I like stories about "patents" but not (usually) whether the stories are pro or anti-patent. It is totally true that I might miss an interesting story about the new OSX or about Pinterest, but I'm willing to live with that.

Two larger points are that

1. HN is only a small fraction of the news I consume, so it wouldn't matter that much to me even if it were a bubble chamber, and
2. The main reason I did this is that I simply couldn't keep up with the volume of stories otherwise.

Last night when I spoke about this, someone asked me whether I was concerned about all the false negatives I was missing. But before I started this, my RSS feed had like 800 (and growing) unread HN articles in it. Reading some of them, even a targeted subset, is better than none.

Anyway, thanks for all the comments. I'm surprised (and glad) that people are so interested in this!

10
sounds 4 days ago 3 replies      
Why is HN opposed to scraping? I think it's because the front page is dynamically generated, so a bot would waste resources. The site gets loaded (slow to respond) during peak times.

Beyond that, only pg could say...

11
switz 4 days ago 1 reply      
Wow this is great! I was actually thinking about something like this the other day. Any chance on releasing the source?

Edit: Found it - https://github.com/joelgrus/hackernews

12
lt 4 days ago 0 replies      
I think a good feature to measure would be the number of duplicates a story has multiplied by the number of days between submissions (normalized somehow).

I have this theory that "atemporal" stories (technical analysis, insightful essays, etc) that keep getting resubmitted every year are more interesting than news about the latest gadget. I've written about it before: http://news.ycombinator.com/item?id=2505081

13
guynamedloren 4 days ago 0 replies      
Another huge advantage of this tool is that by only seeing what's interesting to you, you won't find yourself wading through every single story on the front page of HN during downtime.

Looks like a great start to an awesome project. I think the next logical step would be to expand to give anybody a filtered HN experience, but you probably didn't need me to tell you that :)

14
barmstrong 4 days ago  replies      
Interesting post. This isn't quite the same, but I wanted to mention I created a project, http://ribbot.com, to let people create their own Hacker News-style site on other topics.
9
Why we created Julia - new language for fresh approach to technical computing julialang.org
481 points by jemeshsu  3 days ago   184 comments top 2
1
mjw 3 days ago  replies      
I wonder what they think about or have learned from http://en.wikipedia.org/wiki/Fortress_(programming_language) , another recent-ish attempt to deliver a modern and powerful scientific programming language.

Personally I'm a little wary of being ghettoised into something overly domain-specific for scientific/numerical computing. Really good interop may mitigate that -- something which can navigate the unholy mix of C, C++, fortran, matlab, octave, R and python routines one comes across trying to reproduce others research work, would indeed be awesome.

I do wonder if some of the noble demands of this project might be better delegated to library developers though, after adding a bare minimum of syntax and feature support to a powerful general-purpose language. For now Python+numpy+scipy seems a great 90% solution here.

2
wbhart 3 days ago  replies      
Much praise!! These guys have incredibly good taste. Almost every single thing I can think of that I want in a programming language, they have it. All in the one language!

The fact that it has parametric types, parametric polymorphism, macros, performance almost as good as C, good C/Fortran interop, 64 bit integers and an interactive REPL all in the one language just blows my mind.

I wasn't able to tell if it is possible to overload operators, which is another thing essential to mathematical code.

I was also unsure why the keyword end was needed at the end of code blocks. It seems that indentation could take care of that.

I also didn't see bignums as a default type (though you can use an external library to get them).

However, all in all, I think this is the first 21st Century language and find it very exciting!

10
Apple announces Mac OS X 10.8 Mountain Lion apple.com
456 points by cstuder  5 days ago   356 comments top 3
1
api 5 days ago 2 replies      
I really hope they don't try to converge the interfaces too much. The design elements in Lion that were very iOS-like were IMHO the worst.

The thing is: pads and phones are fundamentally different kinds of devices. Their UI paradigm is designed around frequent but brief and relatively shallow interactions, while a PC is for deeper longer-term interactions. Trying convergence here seems like something very easy to botch horribly.

2
pwthornton 5 days ago 7 replies      
The thing I'm by far the most excited about is AirPlay Mirroring. If I'm reading this correctly, I'll be able to mirror anything from my Mac to my TV using an Apple TV. This will be great for watching Internet videos, going through my iPhoto library with my wife, looking at home videos, etc. All without needing a stupid cable from my couch to my TV.

The deeper iCloud integration is also intriguing. I assume this means proper syncing between documents on a Mac, iPad and iPhone. This will be a huge feature, and is currently a major stumbling block with iCloud. I want my documents to sync to my mobile devices.

Those two features alone are pretty big. Beyond that, I'm interested to hear about the new APIs and anything else that has been changed under the hood.

OS X Lion has always been a little bit buggy (the latest release is pretty solid, however). I wonder if Mountain Lion will be like Snow Leopard where it tightens up a lot of things under the hood and gives us a better realization of the OS.

I'm excited for Gatekeeper for my parents and other people I know that aren't that tech savvy. This could save me a lot of headaches.

3
ori_b 5 days ago  replies      
"OS X Mountain Lion arrives this summer. With all-new features inspired by iPad, the Mac just keeps getting better and better."

Am I the only person that doesn't want an iPad on the desktop?

11
How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did forbes.com
451 points by antoncohen  5 days ago   156 comments top 5
1
SeoxyS 4 days ago 3 replies      
For those who want real journalism, here is the original story:

http://www.nytimes.com/2012/02/19/magazine/shopping-habits.h...

It's much better than Forbes' click-bait re-packaging of it, by the way.

2
subpixel 5 days ago 3 replies      
A fie on Forbes for slapping a click-baiting headline on another publication's professional reporting and hard work.

It's one thing to tweet about it, but to reprint numerous chunks of the original article? Wrapped in your own ads? How is this journalistically sound?

3
wallflower 5 days ago 3 replies      
And then we have Wal-Mart...

> "We were contacted about two years ago by somebody who runs a security company that had been asked in a request for proposals for ways they could link video footage with customers paying for their purchases," Albrecht said. "Wal-Mart would actually be able to view photos and video of customers paying, say, for a pack of gum. At the time, it struck me as unbelievably outlandish because of the amount of data storage required."

http://cryptome.org/eyeball/walmart/walmart-birds.htm

In general, an isolated video is not interesting.

It becomes more interesting and potentially scary once you have a digital, searchable, analyzable history of video customer transactions.

Reduce leakage/theft:

If a cashier has already been flagged and the customer matches up in the network of the cashier's friends (Facebook?), possibly in conjunction with another theft-deterrent system, managers could watch a potential leakage event (where the cashier doesn't scan certain items) in real time.

Hyper-targeted marketing:

(New) wedding ring detected. Commence deluge of in-kind marketing partnerships with Home Depot, maybe even Crate & Barrel.

Customer over the past three months has been showing signs of possible pregnancy relative to their baseline body mass index. Somehow, non-creepily, market to her via 3rd-party mailing lists so she has no idea how you learned she was expecting, or more subtly by changing the default landing page of walmart.com to reflect expectant motherhood when her cookie is detected.

Kids. If the kids seem hyperactive in the overhead view, email coupons for toys that appeal to ADD-type kids.

Over the last year of transactions, customer's head has been exhibiting signs of male pattern baldness. Send them targeted coupons for hats to see if they think it's something they need to cover up.

4
ben1040 5 days ago 2 replies      
I think the creepy feeling comes in that you don't realize the merchant is profiling you. At least not until they tip their hand a little too much (like sending you an entire booklet of nothing but maternity stuff).

You get a Safeway club card and you know full well that they're tracking you, because you hand them the card every shopping trip. Or you use Amazon or Netflix, where customers really want to be tracked -- the tracking and data mining are part of the draw, because they have such good suggestion engines.

I had LASIK a few years back and bought dry eye drops from Target - a lot of them, over six months time. Then my eyes healed and I quit buying it. A month later I get a coupon generated by a Target cash register for Systane drops, the exact brand I used to buy. I guess they thought I started buying them at Walgreens instead?

It was only when they gave me a coupon for a specific brand and product, that most of the population would likely not buy on a typical weekly shopping trip, that I realized all my purchases were being tracked and linked by my credit card number.

All that time before then, their prediction engine was selecting coupons for completely unrelated products, and I was using the coupons not even thinking about it.

5
SquareWheel 5 days ago  replies      
Maybe it's because I live on the internet and think there's 1024 meters in a kilometer, but I didn't find this surprising or worrisome in the least. I would expect all large stores to perform data-mining, that data is valuable.
12
Youporn.com is now a 100% Redis Site groups.google.com
442 points by potomak  5 days ago   117 comments top
1
antirez 5 days ago  replies      
That's porn for database geeks: 100 million pageviews per day, 300k requests per second against Redis.

The way they use MySQL is also interesting IMHO: they populate a relational database in order to be able to build new indexes on the Redis side, using the relational DB for the stuff it is best at, generating new "views" of the data easily.

(Relational DBs are also good to do a zillion more things of course.)
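
(Editorial aside: the pattern antirez describes -- relational DB as the source of truth, Redis holding rebuildable "views" -- can be sketched as below. sqlite stands in for MySQL, and a tiny in-memory class stands in for the two Redis sorted-set commands used; real code would call redis-py's `zadd`/`zrevrange` against a live server. The schema and data are invented.)

```python
import sqlite3

# Stand-in for a Redis sorted set, implementing only the two commands used
# here. The point is the pattern: any Redis "view" can be regenerated from
# the relational database with one query.
class FakeSortedSet:
    def __init__(self):
        self.members = {}

    def zadd(self, mapping):
        self.members.update(mapping)

    def zrevrange(self, start, stop):
        ranked = sorted(self.members, key=self.members.get, reverse=True)
        return ranked[start:stop + 1]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE videos (id INTEGER, title TEXT, views INTEGER)")
db.executemany("INSERT INTO videos VALUES (?, ?, ?)",
               [(1, "a", 500), (2, "b", 1500), (3, "c", 900)])

# Rebuild a "most viewed" index on the Redis side from the relational data.
most_viewed = FakeSortedSet()
for _id, title, views in db.execute("SELECT id, title, views FROM videos"):
    most_viewed.zadd({title: views})

print(most_viewed.zrevrange(0, 1))  # ['b', 'c']
```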

13
All Web Developers Should Stop Doing This Immediately technologyreview.com
423 points by rberger  5 days ago   123 comments top 4
1
angrycoder 5 days ago 6 replies      
A web developer made that page, but it wasn't his decision to put it there. I am going to go out on a very short limb here and say the web developer was one of the few people advocating against blocking tablet users from using the normal website, since they actually understand how the web works.

In fact, when presented with the mandate that they needed a mobile app, the web developer probably just wanted to create a nice HTML5 site that could be served up in any tablet or mobile browser without the need to install a special app for their particular device. Know why? Because that is what web developers do.

2
jwr 5 days ago 3 replies      
Oh yes. We should also mention the horrible broken piece of crap called "Onswipe", that produces a slow, broken, confusing interface that also happens to limit screen space dedicated to reading and breaks built-in Safari zooming.

For some reason people think that they need to have a "Special Tablet Version". You don't! Just have a normal web site, we'll be fine!

The onswipe situation got so bad that whenever I encounter a site using it I immediately click "back" and never come back again.

UPDATE: Oh, I should also mention that half the time when trying to click "back" I actually click the silly Onswipe button in the upper left corner of the screen that pretends to be the back button but actually does something else (takes you to the main blog page I think). Cursing ensues.

3
51Cards 5 days ago 2 replies      
It's really simple: I believe they serve up their videos with Flash, which your tablet doesn't support. Flash for video isn't going away yet in many cases because HTML5 video still lacks the security features sites like this one require. So they did the next best thing: they built a custom app for your platform, which gave them the security they desired and you a full user experience.

Edit: Just confirmed I can watch their videos on my Android tablet, thus it's just a lack of Flash that caused this. In fact on my ASUS Slider they are playing perfectly.

4
shrikant 5 days ago  replies      
I didn't see the author of the piece mention it anywhere, but this is clearly because his friend sent him a link to a video which is delivered via Flash. 60 Minutes video segments on the CBS News site don't have any associated text below them. He'd have been better served by his friend sending him the link to the associated article instead.

(...or is this so blindingly obvious that it just doesn't warrant a mention?)

IIRC on the iPad, even YouTube links open the YouTube app. Not sure what happens if you don't have the app installed - does it ask you to download the app instead?

Edit: I suppose one could make the argument that if the CBS News website can serve up different stuff to the iPad, then they might as well serve up what's compatible instead..

14
How Basecamp Next got to be so damn fast without using much client-side UI 37signals.com
418 points by pors  4 days ago   127 comments top
1
jashkenas 4 days ago  replies      
This is all fascinating ... but does it feel like a sustainable approach going forward, or more like a turbo-charged horse and buggy?

Developer happiness aside -- there are plenty of folks who like JavaScript just as much as Ruby -- and assuming that we're talking about a non-public-facing app (like Basecamp Next), is there still an argument that can be made in favor of doing your UI on the server side?

Even just in principle, it can't possibly be close to as fast (client-side can render optimistically, where possible), and it can't be close to as flexible (you don't have a model of state on the client to perform logic with).

I think that the truth of this is already admitted in that the really fancy bits of UI are being implemented in JS: http://37signals.com/svn/posts/3094-code-statistics-for-base...

15
Mountain Lion: John Gruber's personal briefing daringfireball.net
413 points by MaxGabriel  5 days ago   302 comments top 2
1
parfe 5 days ago  replies      
That's when Schiller tells me they're doing some things differently now.

I wonder immediately about that “now”. I don't press, because I find the question that immediately sprang to mind uncomfortable. And some things remain unchanged: Apple executives explain what they want to explain, and they explain nothing more.

What useless reporting! How about just asking a follow up question about what is different? Or why OS X will not support Siri in this release?

Strike out all Gruber's writing involving the feel of the room and this might as well be a feature list on Apple.com.

And instead of a room full of writers, journalists, and analysts, it was just me, Schiller, and two others from Apple -- Brian Croll from product marketing and Bill Evans from PR.

Not even a head nod to let us know he has at least a small sense of what his purpose at this meeting is. Just any hint of self-awareness that he's merely an extension of Apple Marketing would suffice. One of the most valuable companies on the planet sees value in free advertising through Gruber. I'm not sure if that reflects more poorly on Gruber as the stooge, or his readers as pawns.

2
beatpanda 5 days ago  replies      
When journalists say they're different from bloggers, what they're talking about is the kind of navel-gazing bullshit that Gruber spends half his time on here -- describing the event itself, and his presence and experience there, making sure the reader knows that he was among only a few dozen people invited to this exclusive one-on-one press briefing.

A more competent writer would have spent maybe a sentence or two explaining the novelty of the briefing and then moved on. Gruber spends four paragraphs. I don't need to know that they gave him free coffee, or that the chair was comfortable. The whole thing comes off as sickeningly conceited.

16
Show HN: Bootswatch, free swatches for your Bootstrap site bootswatch.com
396 points by parkov  6 days ago   53 comments top 18
1
quan 6 days ago 13 replies      
Wow, what a coincidence! I just created a very similar project http://www.lavishbootstrap.com/

How did you generate the different themes, and are they stored in different css/less files?

2
stdbrouw 6 days ago 2 replies      
It's interesting to see Bootstrap is slowly turning into what jQuery UI was always supposed to be.
3
wildmXranat 6 days ago 1 reply      
As a developer, I'm not kidding when I say that Bootstrap and related projects made it feel like perpetual Christmas. Thanks mate!
4
jonny_eh 6 days ago 1 reply      
This is very very cool.

It'd be even cooler if users could customize the themes right on the site, but that's probably obvious and not easy to implement.

5
mcobrien 6 days ago 0 replies      
This is great! One minor thing I noticed is that on the homepage, the bottom three swatches don't seem to link to individual files properly -- they all download the bootstrap.min.css, regardless of what you choose in the split button.
6
josscrowcroft 6 days ago 0 replies      
I love the concept - my only problem with it is that the six starter themes (swatches) featured on the site aren't really attractive enough, more like failed experiments... it could benefit from a design-off where popular designers create a swatch with their own take on the Bootstrap, and submit it to the site...

PS: Overheard on Twitter (https://twitter.com/#!/bcherry/status/169960949967626241):

New web dev trend: "Yo dawg I heard you like Bootstrap so I made bootstrap for your Bootstrap so you can bootstrap while you Bootstrap."

7
pinchyfingers 6 days ago 0 replies      
My life keeps getting better. Thank you, Thomas and Quan, both. I always leave the front-end work for last because design is ridiculously more difficult than coding. Guys like you are easing the anxiety I feel every time I start trying to figure out colors, fonts, layout, etc.
8
thehodge 6 days ago 0 replies      
This is fantastic and just what bootstrap needs to stop people complaining about all the bootstrap sites looking the same.. I hope this stays on the frontpage a while so more people can find out and contribute..
9
rvenugopal1 6 days ago 1 reply      
This is awesome. I presume this is targeted at Bootstrap 2.0. Correct?
One question I have is, how do you plan to deal with changing versions of Bootstrap. Just so I understand, are you recommending dropping the chosen variables.less along with bootstrap less files to compile it?

Thanks

10
arturadib 6 days ago 0 replies      
More and more folks are realizing the power of Open Source - why limit it to functional code when we all need design just as much? This is a fantastic trend, and (I hope) just the beginning.
11
joshmanders 6 days ago 0 replies      
What would be awesome is a way to select all custom colors and build a unique bootstrap swatch right on the site with live preview, like http://bytefluent.com/vivify/ does for VIM color schemes.
12
alanmeaney 6 days ago 0 replies      
Thanks Thomas, this is great. Really wanted to customize bootstrap for a demo site for a project I'm working on and your Spacelab swatch is perfect, saved me loads of time. I'll drop you a mail for a peek when it's live, should be in the next week or so.
13
corroded 5 days ago 0 replies      
Isn't this almost the same as this?

wrapbootstrap.com/themes

which was featured a couple of days ago? Maybe you could submit your "swatches" as themes and earn a couple of bucks :)

14
Blocks8 6 days ago 0 replies      
Nice site. I bookmarked it but it would be cool to sign up for an email when there are new themes available. A very passive way to stay top of mind.

Thanks for creating this!

15
jsavimbi 6 days ago 0 replies      
That's really nice, thank you. All I did was download the file and replaced my existing bootstrap.min.css with it and it works great.
16
creatom 6 days ago 0 replies      
Just awesome, that's exactly what Bootstrap was missing! I hope your project will grow big, good luck!

P.S. That would be even cooler if there were more dark themes.

17
mobytea 4 days ago 0 replies      
Great stuff; lavishbootstrap is great too. Hope that maybe the Kuler color schemes or the COLOURlovers API get integrated!
18
tormentor 6 days ago  replies      
Awesome! Soon all our sites will look like bootstrap but now with a variety of different colors. THANKS!! CSS is so hard.
17
Between a rock and a hard place " our decision to abandon the Mac App Store atlassian.com
359 points by jespern  5 days ago   154 comments top 3
1
tzs 5 days ago 3 replies      
Wouldn't security-scoped bookmarks solve some of the problems he describes? From the sandbox design guide:

--------------------------------

Starting in Mac OS X v10.6, the NSURL class and the CFURLRef opaque type each provide a facility for creating and using bookmark objects. A bookmark provides a persistent reference to a file-system resource. When you resolve a bookmark, you obtain a URL to the resource's current location. A bookmark's association with a file-system resource (typically a file or folder) usually continues to work if the user moves or renames the resource, or if the user relaunches your app or restarts the system.

In an app that adopts App Sandbox, you must use a security-scoped bookmark to gain persistent access to a file-system resource.

Security-scoped bookmarks, available starting in Mac OS X v10.7.3, support two use cases:

• An app-scoped bookmark provides a specific sandboxed app with persistent access to a user-specified file or folder.

For example, if your app employs a download or processing folder, present an NSOpenPanel dialog to obtain the user's intent to use a specific folder. Then, by creating a security-scoped bookmark for that folder and storing it as part of the app's configuration (perhaps in a property list file or using the NSUserDefaults class), your app acquires a means to obtain future access to the folder.

• A document-scoped bookmark provides a specific document with persistent access to a file.

For example, a code editor typically supports the notion of a project document that refers to other files and needs persistent access to those files. Other examples are an image browser or editor that maintains an image library, in which the library file needs persistent access to the images it owns; or a word processor that supports embedded images, multimedia, or font files in its document format. In these cases, you configure the document format (of the project file, library file, word processing document, and so on) to store security-scoped bookmarks to the files a document refers to. (A document-scoped bookmark can point only to a file, not a folder.)

A document-scoped bookmark can be resolved by any app that has access to the bookmark data itself and to the document that owns the bookmark. The document can be a flat file, or a document distributed as a bundle.

--------------------------------

2
zdw 5 days ago 3 replies      
Dev tools don't really have a place in managed environments IMO - they just need too low level of access to a system to be able to do their work.

Now, say a game or web browser that runs potentially malicious content, sure, sandbox it. But other things like code interpreters, low-level Unix tools, or inter-process tools like AppleScript, they're still open to (mis)use by anyone.

I'm going to guess that most malware for OS X will soon become non-compiled scripts. Sure, the interpreter would be signed, but what it runs is totally arbitrary.

3
Ryanmf 5 days ago  replies      
Apple needs to do better than this.

For the duration of the company's existence, one of their biggest customer segments has been the creative industry. I can't think of a single pro audio/video/graphic/etc app that doesn't make extensive use of plug-ins, another Mac App Store disqualifier.

Do the developers of these apps necessarily have a "right" to iCloud APIs, delta updates, and other benefits of playing in Apple's sandbox? Of course not.

But is Apple harming themselves and their customers by excluding the creators of these apps from the party and potentially causing them to focus their development efforts elsewhere? I think they may be.

18
Twitter to move away from Hashbangs storify.com
352 points by ChrisArchitect  1 day ago   77 comments top 8
1
simonw 1 day ago 3 replies      
This is fantastic news.

From the recent tweets by https://twitter.com/danwrong it looks like Twitter are moving entirely to HTML5 pushState, and leaving IE users with full page refreshes rather than continuing to serve them #! - Dan says "I'm not sure why everyone is so adverse to page refreshes these days. You can make them fast too."

Of course, Twitter are going to have to include a piece of JavaScript on the http://twitter.com/ homepage which checks for a #! and redirects the user to the corresponding page - and they'll have to keep that JavaScript there forever, since they have nearly two years worth of links that they need to avoid breaking. One of the many reasons #! is such a nasty hack.

In terms of performance, this is going to make Twitter a lot /faster/ for me - I often open Twitter profile pages in new windows (due to working on Lanyrd) and each new window has to pull in and execute a HUGE chunk of JavaScript before it will display the page. Being able to just load a regular HTML page will be much faster for me.

2
Nitramp 1 day ago 2 replies      
The actual URL for the informative blog post is this: http://danwebb.net/2011/5/28/it-is-about-the-hashbangs

The linked blog post only contains some relatively meaningless Twitter messages and the hyperlink as text, not as an actual link.

One of the things the post doesn't mention (it's sort of implicit in "going under the radar") is that with hash bangs, every request has double the round trip time to retrieve the initial data being displayed, as the server cannot know what data the client wants to retrieve. This makes a lot of nifty performance optimizations impossible.

3
nikcub 1 day ago 4 replies      
Good. A lot of developers justified doing it in their own projects because Twitter and Gawker were doing it. Now that one of the headline sites is no longer using it (and will hopefully condemn it) we can file this episode to history and never speak of it again.

Edit: wouldn't it be awesome if Google (they did start this, after all) would allow sites using hashbangs to auto-update all indexed URLs

4
jquery 1 day ago 1 reply      
Twitter's implementation of the hashbang was awful. It broke the back button and it was slow. I don't think it's a fair representation of the technique.

EDIT: And based on their implementation, I wouldn't trust anything their engineers have to say about hashbangs either.

5
guelo 1 day ago 0 replies      
Good. Now if they would get rid of minified urls they would be done with their damage to the web.
6
technomancy 1 day ago 0 replies      
I've been using the mobile version on my laptop for a few months now since the hashbang was so sluggish, but this could get me to switch back.
7
protospork 23 hours ago 0 replies      
YES!
For anyone on a high-latency connection (huge swathes of the US are still stuck with satellite or mobile, and the tech industry seems to have ignored this), Twitter is a nightmare. The first pageload only pulls the empty 'framework' page, then a series of js requests pull the information. You can't walk away while it loads, either, because it will register the latency and display errors instead of content.
8
kylemaxwell 1 day ago  replies      
Can somebody explain to non-web-types why this matters, other than making the URL itself look cleaner?
19
Stop Paying Your jQuery Tax samsaffron.com
350 points by sams99  5 days ago   83 comments top 7
1
alecco 4 days ago  replies      
Please go check out StackOverflow's source code before bandwagoning on this topic.

This is shifting blame to the tools. The problem lies on StackOverflow's lacking design and not in jQuery.

Pushing jQuery to the bottom of the page is trivial if you do proper HTML architecture. Pages should have only HTML. JS files only JS. JS files referenced at the very bottom of the body. Very simple. (if your asp/c#/* framework doesn't make it easy, blame the framework and not jQuery)

Inlining JS and even having JS code in the title attributes is ridiculous. Fix your HTML first; only then do you get the right to say anything. Another telling gem: the StackOverflow page doesn't even have a proper encoding declaration.

You don't need to use jQuery for everything. This is very typical of developers coming from backend who don't take web development seriously. Use CSS as much as possible.

Another misplaced attack is refresh, with proper cache headers it should not take that long. If some browsers are slow and don't keep a pre-parsed cache, blame the browser vendor and not jQuery.

jQuery taking 80ms on mobiles is quite OK. If you really care about mobile make a page optimized for mobile and minimize JS rendering and styling.

I absolutely love StackOverflow and it's one of the best things to happen to programmers in the last few years. But this self-righteous attack on a very important tool is very misleading and ungrateful.

Edit: the proposed "solution" of catching $.ready and later calling those is insane.

2
pragmatic 4 days ago 1 reply      
This linked article is good:
the Performance Golden Rule http://www.stevesouders.com/blog/2012/02/10/the-performance-...

(submitted days ago but got no traction)

For years we have been told that we will have to wait on the database so it doesn't matter how fast your implementation language is...

Well it's half true, depending on what you consider your implementation language: server side or client side.

Front end performance matters just as much (or by this article) more than "backend."

3
sequoia 4 days ago 0 replies      
I'm very confused... why aren't the scripts in the footer? Because of inline javascript? What problem is this solving? What does this have to do with jQuery in particular? I'm very confused...
4
EGreg 4 days ago 2 replies      
That is a great point!

An interesting question is, why not just put ALL scripts at the end of the <body> tag, after the HTML of the page has loaded and the CSS probably did as well?

The only thing I can think of is if you have code ON the page which uses these scripts. But why not just put that code at the end of the page, too?

5
chokma 4 days ago 0 replies      
The Grails framework has a plugin to organize resources like JavaScript - http://grails-plugins.github.com/grails-resources (scripts will by default be added to the end of the page and even if your page consists of several template files which depend on jQuery etc, they will only be included once).
6
jackmoore 4 days ago 1 reply      
This is clever. However, it depends on scripts using the shorthand notation for jQuery's ready method. Maybe it would be better form to use something explicit rather than doubling up the $ operator.
7
VMG 4 days ago  replies      
Nah, I'm going to pay the tax rather than optimize prematurely.

But I'll add that to my bag of tricks when I start optimizing.

20
The machine's view of time, if nanoseconds were seconds plus.google.com
349 points by rictic  6 days ago   64 comments top 10
1
kirubakaran 6 days ago 3 replies      
I liked this better:

L1 - You've already started eating the sandwich, and only need to move your mouth and take another bite. (2 seconds)

L2 - There is a sandwich on the counter, so you need only find it, pick it up, and begin eating. (10 seconds)

RAM - You're near the fridge, but you need to open it and quickly throw together a sandwich. (3 minutes)

HD - Drive to store, purchase seeds, grow seeds, harvest etc. (1 year)

http://www.reddit.com/r/programming/comments/90tge/hey_rprog...

http://news.ycombinator.com/item?id=702713
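
(Editorial aside: the analogy's scaling is just multiplying latencies by 10^9 so a nanosecond reads as a second. A quick sketch, using rough, commonly cited latency figures rather than numbers from either post:)

```python
# Rough, commonly cited latencies in nanoseconds; under the analogy,
# ns nanoseconds read as ns seconds.
latencies_ns = {
    "L1 cache hit":        1,
    "L2 cache hit":        4,
    "RAM access":        100,
    "SSD random read": 16_000,
    "HDD seek":    4_000_000,
}

def human(seconds):
    """Format a duration in seconds as s / hours / days."""
    if seconds < 120:
        return f"{seconds:.0f} s"
    if seconds < 86_400:
        return f"{seconds / 3600:.1f} h"
    return f"{seconds / 86_400:.0f} days"

for name, ns in latencies_ns.items():
    print(f"{name:15s} {human(ns)}")
```

A disk seek that "feels" instant really is a month-and-a-half-long errand at the machine's timescale, which is the whole point of the sandwich analogy above.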

2
nirvana 6 days ago 5 replies      
I think this is in error. The delivery times for SSDs are correct, if you only consider the periods when the SSD is working. When the SSD fails, the delivery time comparison is the AGE OF THE SUN. Ok, I kid. You never get your data. So, let's call "Age of the sun" an average between really fast and infinity.

I've owned 3 SSDs and have had 2 failures, so far, over 2 years[1]. In the past 20 years, I've owned around 100 hard drives and have had only 4 failures.

This is the Achilles heel of the SSD for me. I've gone back to spinning rust because I need the reliability more than I need the performance.

The performance was nice, very nice. But having to restore from backup is something that I do not like doing every year. I'd like to do it once a decade or less.

Until then, I'm no longer using SSDs.

I did a bunch of research into why SSDs fail and inevitably it seems to be software bugs due to the SSDs being clever. I suspect the Samsung SSDs that Apple uses are not clever and thus do not fail. I will use an SSD if it comes with an Apple warranty. But I had an Intel SSD fail, and I had a Sandforce-based SSD fail. Both catastrophically with zero data recovery (fortunately I had backed up, though in both cases I lost a couple of hours of work for various reasons). In both cases, near as I can tell, the SSD had painted itself into a corner-- it actually hadn't been used enough to have flash failures sufficient to be a problem, let alone in excess of the extra capacity set aside. Nope, it was a management problem in the controller that caused the failures. These kinds of problems can be worked out by the industry, but given that the market has existed for 3-4 years now and we're still having these kinds of problems, I'm going to wait before trying something clever again.

[1] The one that is still working is in my cofounder's machine, and I'm dreading the day that it too fails. I am afraid it is just a matter of time, and as soon as we can reshuffle things they'll be using spinning rust again as well.

3
jgw 6 days ago 1 reply      
Cool analogy. Makes a great reference point.

As an ASIC guy, I like to occasionally mention to software guys that at 3GHz, light travels about four inches in one clock cycle, and it frequently blows their minds.

4
buff-a 6 days ago 5 replies      
OCZ Vertex 3's have been pounded for reliability problems[4], so much so that they've just started a special deal on Newegg [2]. And coincidentally, I'm sure, a jolly story about "a machine's view of time" replete with olde-worlde charm, shows up on the front page of a major tech site, and oh, by the way, let me end by saying "I use OCZ Vertex 3's"...

Tom's Hardware suggests that Crucial's m4 series are faster than OCZ Vertex 3's, and don't come with a horrendous approval rating. A 256GB m4 is $319 on Newegg [1].

Intel's new 520 SSDs appear to have given them a proper SSD instead of the floppy-disc-like performance of the 510 [3], though it's $499 for 240GB [5].

All drives have failures, and while it sucks to be the one that gets the dodgy drive, there will always be someone who can post "it didn't work for me". However, the OCZ Vertex have an unusually high number of "It didn't work for me" reviews. Is it a stitch-up? It'd be easy for "a motivated third party" to buy 27 drives off newegg and post negative reviews. It'd also be in OCZ's interest to fan the flames of doubt on the SF2281 as they are releasing new SSDs based on their own, newly-purchased, Indilinx controllers. But taking off the tin-foil hat, it does look like Vertex 3's have problems.

[1] http://www.newegg.com/Product/Product.aspx?Item=N82E16820148...

[2] http://promotions.newegg.com/OCZ/022912/index.html?cm_sp=Cat...

[3] http://www.tomshardware.co.uk/ssd-520-sandforce-review-bench...

[4] http://www.newegg.com/Product/Product.aspx?Item=20-227-707&#...

[5] http://www.newegg.com/Product/Product.aspx?Item=N82E16820167...

5
dazbradbury 6 days ago 0 replies      

  And yet, if you can wait three years for the first wooden
boat, it can often be at the head of a convoy which will
keep you busy for many thousands of years, sometimes even
orders of magnitude more if you take a minute to request
that another convoy sets out.
- James Gray

I was going to make a point about random access of one bit vs. sequential access of large portions of data, but the comment from google+ above summed it up perfectly.

Thanks for posting. A very insightful analogy, really putting things into perspective.

6
scott_s 6 days ago 0 replies      
I've used this analogy in reverse. My roommate was also a CS PhD student, and I explained that when it comes to toilet paper, we can't afford to let cache misses go to disk.
7
daeken 6 days ago 0 replies      
Wow. I've been doing low-level work where I have to intimately understand computer architecture and optimization work where every nanosecond counts for as long as I can remember, but I've never put it into perspective. This is awesome.
8
zackzackzack 6 days ago 1 reply      
I really liked this. This is the first time scale for computing that really made sense to me. It's a really good mental metaphor that cleared up how computers work for this script kiddie.

Extending that thought to multiple cores/threads: it's comparable to a small business in a way. You have one guy who can go tell other people to do certain tasks. They take anywhere from a few minutes to a few hours. You can set it up so that there is a task list of things for people to do, so that you don't have to continually reassign each one; just tell them to pick up the next thing to do. It's much harder and requires more organization, but ultimately, like the difference between a small business and a one-man show, you get more shit done with multiple people/threads/cores working in parallel than one single unit working by itself.
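The "pick up the next thing to do" setup is just a shared work queue; a sketch (run sequentially here as a stand-in for real threads):

```javascript
// Workers pull from a shared queue instead of being handed
// assignments one by one.
const queue = [1, 2, 3, 4, 5, 6];
const results = [];

function worker(id) {
  let task;
  // Each worker grabs the next available task until none remain.
  while ((task = queue.shift()) !== undefined) {
    results.push(`worker ${id} finished task ${task}`);
  }
}

// Sequential stand-in for N cores; real parallelism would use
// worker_threads and a thread-safe queue, but the pull-from-queue
// shape is the same.
[1, 2].forEach(worker);
console.log(results.length); // 6
```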

Thanks for posting this.

9
phreeza 6 days ago 0 replies      
To me this evoked the image of a monk doing work in a monastery with a huge library. If you picture this monk as your CPU, and assume he works 12 hours every day over a lifespan of 60 years, that's about the work a 1GHz CPU is capable of doing every second.
10
martin_k 6 days ago  replies      
Nice analogy. From a technical standpoint, however, access patterns often make a bigger difference than the type of your storage device. The difference between sequential access on disk and sequential access on SSD isn't nearly as big as random access on disk compared to sequential access on disk.
21
Chrome connects to three random domains at startup mikewest.org
346 points by tbassetto  3 days ago   54 comments top 6
1
subwindow 3 days ago 4 replies      
This has some interesting ramifications- some network security appliances (I work for a company that makes one) look for suspicious sets of DNS requests that match the Domain Generation Algorithms that malware like Conficker use to find a command and control server.

These "random" requests look almost exactly like the DGA for Murofet, a Zeus variant. This has caused some problems for us (and other vendors, I would assume) in the form of massive numbers of false positives. In short, it's been kind of a PITA.

I wish they wouldn't do this, but it is definitely a tough problem to solve and I can't think of a better approach off of the top of my head. The ultimate culprit is the ISPs that return an A record for a DNS request that really should return NXDOMAIN. These ISPs are essentially breaking the Internet, and we're all just scrambling to put band-aids in place to get it to work again.
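For readers wondering what these probes look like: a hedged sketch of generating random single-label hostnames of the kind described. The length range and charset here are assumptions for illustration, not Chrome's exact algorithm.

```javascript
// Generate a random lowercase hostname label.
function randomHostname() {
  const letters = 'abcdefghijklmnopqrstuvwxyz';
  const len = 7 + Math.floor(Math.random() * 9); // 7-15 chars (assumed)
  let name = '';
  for (let i = 0; i < len; i++) {
    name += letters[Math.floor(Math.random() * letters.length)];
  }
  return name;
}

// Three probes; if all of them "resolve" to the same A record,
// the resolver is almost certainly hijacking NXDOMAIN responses.
const probes = Array.from({ length: 3 }, randomHostname);
console.log(probes);
```

The trouble, as the parent notes, is that this traffic is statistically indistinguishable from malware DGA lookups.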

2
cydonian_monk 3 days ago 3 replies      
I noticed this two years ago, and eventually had to stop using Chrome (and Chromium browsers) as it would result in my ISP blocking all DNS requests from my IP (for a seemingly random amount of time). Even requests to public DNS would fail. It took a while to identify Chrome as the culprit, and I wasn't convinced even after seeing the bogus DNS requests in TCP traps. So I started a cycle of using/not using Chrome, and it became obvious.

The easy solution was to stop using Chrome. The hard solution was to move. I've done both, but have yet to start using Chrome again.

3
kamechan 3 days ago 1 reply      
If you go to chrome://chrome-urls/ and then down to net-internals, you can get a pretty transparent view of exactly what Chrome is doing. Plus there's a bunch of other stuff there too.
4
unseen 3 days ago 4 replies      
Neat feature. In my opinion, they should go even further and make use of the information and inform the user, too:

"Your ISP (network administrator, ..) is intercepting and manipulating DNS requests. Do you want to use Google DNS instead?"

Comes in real handy when we need to have a "DnsManipulationDetector" in the future that checks if your DNS is actively censoring...

5
clsdaniel 2 days ago 0 replies      
Interesting. Yesterday I moved to a local caching DNS server (much faster and more reliable than what the ISP provided), but I forgot to set it to run at startup. First thing today, I tried to get to my Gmail account and everything worked; then I tried another site and got a DNS lookup error. Then I remembered I had named turned off, which means Chrome may have hardcoded Google sites' IP addresses or is caching them (less probable, since the other site didn't work).
6
iRobot 3 days ago  replies      
.. and a really great reason for using open source programs
22
Our unrealistic views of death, through a doctor's eyes washingtonpost.com
341 points by llambda  2 days ago   250 comments top 3
1
coolestuk 2 days ago 2 replies      
"At a certain stage of life, aggressive medical treatment can become sanctioned torture. "

I just went through a month with a 90 year old friend whose life ended almost exactly like the one in that story. He had one lucid 30 minutes when I was there and his family was there (he had been a general physician for almost 50 years). In that brief period when he had the energy to try to communicate whilst almost totally paralysed, it was clear he was telling the attending doctor that he wanted them to stop all medication and let him die. His own family could not face that fact, and said they'd ask him again the following day (unfortunately the cowardly doctor backed them up on this). He was never again lucid or strong enough to insist that treatment be stopped. He lived for another 10 days, struggling to breathe, almost totally paralysed, unable to control his bowels.

This was a man who when I last went on holiday with him at the age of 85, he insisted on carrying his own suitcase and refused a wheelchair, even though he had trouble walking and had blood pressure and angina problems.

I don't blame his family for not being able to make that decision (it's so hard to let go of someone one loves). But his last weeks were undoubtedly torture, and they know they refused to follow his wishes. It was just terribly sad and an awful dilemma.

I was really glad of something else I read on HN about 6 months ago, where a doctor had a brain tumour (or something like that) and instead of treatment, he lived out the remainder of his life doing the things he loved. I think that idea was what meant I could come to terms with the need to respect my friend's last wishes. I just could not convince his family.

2
bradleyland 2 days ago 4 replies      
At first, I was confused by this statement, given the data:

> "...modern medicine may be doing more to complicate the end of life than to prolong or improve it"

    1900: at 65 -> +12 years; at 85 -> +4 years
    2007: at 65 -> +19 years; at 85 -> +6 years

The engineer in me said, but we've improved! But then I realized that evaluating life by measuring in years is like reviewing tech products by looking at spec sheets.

"But it has more megapixels!? Aren't megapixels what we want?"

Reality is far more subtle.

3
mistercow 2 days ago  replies      
> our culture has come to view death as a medical failure rather than life's natural conclusion.

Death is a medical failure, just like our inability to cure herpes is a medical failure. That there's no way to overcome the failure yet does not imply that it is not a failure.

23
* { box-sizing: border-box } FTW paulirish.com
335 points by tambourine_man  5 days ago   84 comments top 12
1
tambourine_man 5 days ago 5 replies      
As it's been said in the comments, it's amazing that after years of lobbying IE to change its box model to match W3C, we realize that maybe IE's model made more sense after all.
2
josscrowcroft 5 days ago 1 reply      
This article was an eye-opener - I write CSS all the damn time and hadn't even clocked box-sizing yet.

I don't wanna know how many hours I've spent calculating widths-after-padding(-but-wait-it's-different-on-both-sides) and commenting the CSS so other developers know why this element is width: 169px even though the container is 200px...

Not to mention:

    textarea { width:100%; padding:20px; oh shit. }

3
RyanMcGreal 5 days ago 1 reply      
I can't tell you how much I enjoyed doodling in the margins while I read that article.
4
shocks 5 days ago 2 replies      
Because I know we all wanted it.

http://jsfiddle.net/gSD94/2/
http://jsfiddle.net/gSD94/2/embedded/result/

(great article too!)

edit: better link (fullscreen!)

5
aridiculous 4 days ago 0 replies      
Good god, it's about time this hits the mainstream dev community.

The W3C box model is IMO one of the worst design flaws in front-end dev. It's a model that doesn't follow that of a real box! When you have padding (stuffed newspaper, packing peanuts, etc.) in a packing box, the actual width and height of the box doesn't change! Border-width (or the thickness of the cardboard) is also included in the box dimensions.

I've only heard very weak reasons in the past for the W3C version and I'm surprised we haven't gotten past it. I think border-box will become more important as we head towards cross-platform responsively designed apps and sites with tons of layout decisions to consider. With simple sites up until now, you could afford the loss of control or the cognitive overhead to figure out workarounds. But now, I don't want to have to think in the unintuitive way every time I'm making a layout decision for different screens.

6
jhummel 5 days ago 0 replies      
As I pointed out on twitter http://twitter.com/#!/jhummel/status/169561232045649921 it seems that the * selector doesn't apply to pseudo elements. If you're going to take this route it might be a good idea to include *, ::after, ::before { box-sizing: border-box } to make sure everything is being sized the same.
7
rradu 5 days ago 0 replies      
Somewhat related is the ability to do calculations in CSS3, allowing you, for example, to subtract pixels from percentages.

http://www.w3.org/TR/css3-values/#calc

8
X-Istence 5 days ago 0 replies      
I wrote about the same CSS with a small plea that CSS 3 be ratified and implemented across all browsers as quickly as possible:

http://personal.x-istence.com/post/2010/04/25/css-3-needs-be...

I was working on my portfolio site (link in my profile) and was frustrated with the CSS layout rules. It makes a lot more sense (at least to me) for the way it works with box-sizing: border-box than what it was previously.

If I were not afraid of having my portfolio site also easily accessible by HR in various different companies (many still with IE 6) I'd have used the experimental tags. Instead I used a work-around.

9
54mf 5 days ago 3 replies      
Incredible. I wish someone had told me this a few years ago, would have saved me a ton of grief. Jeremy Keith said it best in the comments: "box-sizing: border-box is the bee's knees."
10
jenius 5 days ago 1 reply      
I love you Paul Irish. I was just complaining about this exact same issue about 3 days ago and when I found this article I was like YES HE AGREES WITH ME!

Especially when dealing with fluid layouts, this guy is a lifesaver

11
robgibbons 5 days ago 2 replies      
Where the traditional box model starts to make sense is with fixed-size elements, such as images. Adding padding to an element shouldn't alter the dimensions of the contents within the element.
12
atacrawl 5 days ago  replies      
I worked on a project recently where the previous designer did this and it was absolutely maddening. Switching the box model can be useful sometimes, but it makes no sense to me why you would do this globally.
24
Essential JavaScript Design Patterns For Beginners addyosmani.com
327 points by mfalcon  4 days ago   35 comments top 11
1
oinksoft 4 days ago 4 replies      
This is a good, if long-winded document. At over 100 pages, it will take a dedicated reader to sift the gold.

"As other patterns covered display aspects of DRY-ness with JavaScript, let's take a look at how to write DRY code using jQuery. Note that where jQuery is used, you can easily substitute selections using vanilla JavaScript because jQuery is just JavaScript at an abstracted level."

That is gibberish, and it shows in the code examples: using an ES5 shim is best practice these days, so that you can use Array.prototype.forEach and not jQuery.each (with its weird argument order), amongst other things that are commonplace in modern JavaScript code. Yet the author defers to jQuery throughout.

"As jQuery is accepted as one of the best options for DOM-manipulation and selection, we'll be using it for our DOM-related examples."

More likely that the document is filled with jQuery because the author is heavily involved with the jQuery project, no?
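The argument-order gripe is concrete: Array.prototype.forEach passes the value first, while jQuery.each passes the index first. A plain-Node illustration (with a minimal stand-in for jQuery.each, since jQuery itself isn't needed to show the order):

```javascript
const items = ['a', 'b', 'c'];

// ES5 Array.prototype.forEach passes (value, index, array):
const firstArgsES5 = [];
items.forEach((first) => firstArgsES5.push(first)); // collects values

// jQuery.each passes (index, value) -- stand-in implementation:
function jqEach(array, callback) {
  for (let i = 0; i < array.length; i++) callback(i, array[i]);
}
const firstArgsJQ = [];
jqEach(items, (first) => firstArgsJQ.push(first)); // collects indices

console.log(firstArgsES5); // ['a', 'b', 'c']
console.log(firstArgsJQ);  // [0, 1, 2]
```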

2
raju 4 days ago 1 reply      
One book that I found to be really good was 'JavaScript Patterns' by Stoyan Stefanov. Great book, though I am not sure how much this article overlaps with the book.

[http://www.amazon.com/JavaScript-Patterns-Stoyan-Stefanov/dp...]

3
danso 4 days ago 0 replies      
On just length alone, if this is the beginner's version, would love to see the intermediate/advanced versions ;)

I like the references to Backbone.js and other modern frameworks...the explanation of Backbone's router vs Spine's controller was particularly helpful, mostly because I know Backbone's router is not a controller but hadn't taken the time to see how Spine does it.

I thought the introduction (everything before the actual examples/cases) was a little too long-winded for me. It was interesting to read as an experienced programmer, but I can't imagine a beginner trudging through all of that without seeing a simplified use case to break up the long narrative text. If the audience for this book is beginners, why go into such great detail about the philosophy of patterns (including a discussion of antipatterns) when most beginners are at the level where they probably don't know much OOP or even things like closures?

Otherwise, another fantastic addition to the open-source bookshelf.

4
karterk 4 days ago 0 replies      
There is a better way to do the mixins in JS. See this awesome post by Angus Croll:

http://javascriptweblog.wordpress.com/2011/05/31/a-fresh-loo...

`this` in JS generally incurs a lot of wrath, but this is one cool hack with `this`.
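The functional-mixin idea reads roughly like this (a sketch in the same spirit, not Croll's exact code): a plain function that augments `this`, applied to a prototype with `.call()`. The Widget/withLogging names are made up for illustration.

```javascript
// A mixin is just a function that decorates whatever `this` is.
const withLogging = function () {
  this.log = function (msg) {
    return `[${this.name}] ${msg}`;
  };
  return this;
};

function Widget(name) {
  this.name = name;
}

// Apply the mixin to the prototype; every Widget gains log().
withLogging.call(Widget.prototype);

const w = new Widget('sidebar');
console.log(w.log('rendered')); // "[sidebar] rendered"
```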

5
hebejebelus 4 days ago 0 replies      
I think you can shave off the "for beginners" from the title. It's really worth the read no matter your skill level. There are so many interesting and different ways to use JavaScript that I think almost everyone can gain something from this.
Really enjoying it so far :)
6
goredho 4 days ago 1 reply      
Dunno, seems to be a regurgitation of GoF patterns along with some JS-specific ones. I'd rather see a link to the GoF patterns, with a concentration on the JS-specific ones. Shouldn't articles be DRY as well?
7
bzalasky 4 days ago 1 reply      
The revealing module pattern is one of my favorites for writing jQuery plugins. This is a great reference for front-end developers once they've gotten good enough to be dangerous with JavaScript. I stumbled across this a while ago after writing my first non-trivial mobile web app from scratch with JavaScript, and it made everything click.
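For readers who haven't met it, the pattern the parent means looks roughly like this (a generic sketch, not from the book):

```javascript
// Revealing module pattern: private state lives in a closure, and
// the returned object literal "reveals" only the public members.
const counter = (function () {
  let count = 0; // private, unreachable from outside

  function increment() { count += 1; return count; }
  function reset() { count = 0; }

  return { increment, reset }; // the revealed API
})();

counter.increment();
console.log(counter.increment()); // 2
console.log(counter.count);       // undefined -- state stays private
```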
8
nedludd 3 days ago 0 replies      
Wow, this will be hard for a beginner to process. Unless that beginner has a week to read it, and has a good background in programming languages and design patterns. And then he probably won't need to read this.
9
TheRevoltingX 4 days ago 0 replies      
Freaking amazing, I've been utilizing MVC heavily to develop HTML5 games. Mainly borrowing ideas from Android and iPhone development with their layout engines and staying out of the way as much as possible.
10
tmanderson 4 days ago 0 replies      
To be specific, jQuery is a utility library with a large set of DOM utilities, and Sizzle is the selector engine doing the actual querying work underneath.

As for the article: it's a valuable resource. Especially for those that believe jQuery is JavaScript. It's a goodie.

11
yread 4 days ago 0 replies      
Wow and I thought I knew Javascript
25
How Mailinator compresses its email stream by 90% mailinator.blogspot.com
316 points by zinxq  12 hours ago   37 comments top 14
1
jrockway 10 hours ago 3 replies      
Mailinator is a great product. My favorite part about it is how whenever I register for something, the clever form validation software always rejects the mailinator.com email address. Then I visit mailinator, see their alternate domain name du jour in image form (so bots can't harvest it, hah!), and then that works perfectly. It makes me giggle with joy every time I do it.

It's also nice not receiving ads in the mail every hour of every day just because I wanted to try some new YC-startup's product.

2
ShabbyDoo 5 hours ago 0 replies      
I recently worked on a project where, to cut down on space, I built a custom "compressor" for lists of tree-like records. You might think of a Person record with N past addresses although this was not the actual domain. No records were cross-linked (at least not formally in the code) and the records were trees, not more general DAGs. The data contained a lot of enumerated data types (address type, gender, etc.). I didn't really care about the space usage for 1K records, but I cared about 1M. I used variable length encoding (a hacked-up subset of the Apache Avro project) for integers to take advantage of most of them being small in the datasets. Lots of lookup tables for enumerated values and commonly-repeated in practice string values. Implicit "keying" based on a record schema to avoid specifying key identifiers all over (our data was not very sparse, so this beat out a serialized pattern of KVKVKV etc.). I thought about taking advantage of most strings having very limited character sets and doing Huffman encoding per data element type, but the results were good enough before I got there. A co-worker also noted that, because parts of these records were de-normalized data, subtree pattern recognition could provide huge gains. I added some record length prefixes to allow individual records to be read one-at-a-time so that the entire dataset would not have to be read into memory at once. IIRC, compression speed was 2-3x gzip(9), and large record sets were 1/10th the size of using Java serialization plus gzip. [Yes, Java serialization is likely the worst way to serialize this sort of data]

Was all of this worth it? It solved the problem of not burning through network and memory, but it was a local optima. The root problem was that this data came from another system which did not provide repeatable reads, and providing them would have been a massive effort. However, our users wanted to meander through a consistent data set over the course of an hour or so. To provide this ability to browse, we throw these records into a somewhat transient embedded H2 DB instance. The serialized format is required primarily to provide high availability via a clustered cache. In retrospect, I would have pushed for using a MongoDB-esque cluster which could have replaced both H2 (query-ability) and the need for the the serialized format (HA).

It surprised me that there were no open source projects (at least Java-friendly ones) which provided compression schemes taking advantage of the combination of well-defined record schema and redundant-in-practice data. Kyro (http://code.google.com/p/kryo) comes closest as a space-efficient serializer, but it treats each record individually. Protobufs, Thrift, Avro, etc. are designed for RPC/Messaging wire formats and, as an explicit design decision (at least in the protobufs case) optimize on speed and the size of an individual record vs. the size of many records. The binary standards for JSON and XML beat the hell out of their textual equivalents, but they don't have any tricks which optimize on patterns in the repeated record structures.

Is this just an odd use case? Does anyone else have a similar need?
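The variable-length integer piece of this is small enough to sketch. This is an illustrative LEB128-style unsigned varint (the same idea Avro's encoding builds on, though Avro additionally zig-zags signed values; this is not Avro's exact format):

```javascript
// Encode an unsigned integer in 7-bit groups, least significant
// first; the high bit of each byte flags a continuation.
function encodeVarint(n) {
  const bytes = [];
  do {
    let b = n & 0x7f;
    n = Math.floor(n / 128);
    if (n > 0) b |= 0x80; // more bytes follow
    bytes.push(b);
  } while (n > 0);
  return bytes;
}

function decodeVarint(bytes) {
  let n = 0, scale = 1;
  for (const b of bytes) {
    n += (b & 0x7f) * scale;
    scale *= 128;
  }
  return n;
}

console.log(encodeVarint(300)); // [172, 2] -- two bytes, not four
console.log(decodeVarint(encodeVarint(300))); // 300
```

Small values (the common case in most datasets) take a single byte, which is where the space win comes from.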

3
markbao 10 hours ago 1 reply      
Great read. I wish there were more articles like these.
4
davesmylie 7 hours ago 1 reply      
I run a similar (though waaaay less popular) site. My mail is stored on disk in a mysql db so I don't have quite the same memory constraints as this.

I had originally created this site naively, stashing the uncompressed source straight into the db. For the ~100,000 mails I'd typically retain, this would take up anywhere from 800MB to slightly over a gig.

At a recent rails camp, I was in need of a mini project so decided that some sort of compression was in order. Not being quite so clever I just used the readily available Zlib library in ruby.

This took about 30 minutes to implement and a couple of hours to test and debug. An obvious bug (very large emails were causing me to exceed the BLOB size limit and truncating the compressed source) was the main problem there...

I didn't quite reach 90% compression, but my database is now typically around 200-350MB, so about 70-80%. Still, I did manage to implement it in about 6 lines of code =)

5
hello_moto 9 hours ago 0 replies      
Both blogs (mailinator and paultyma) are awesome. I need more stuff like this, rather than the typical "Web 2.0, how we use NoSQL, cache everything" posts (often without a clue how to do caching properly; 37signals' caching approach is the rare one in line with Mailinator's techniques: smart and elegant).
6
Maascamp 10 hours ago 0 replies      
Great write up. One of the more interesting things I've read on here in a while. Thanks for sharing.
7
pkulak 9 hours ago 1 reply      
Redis works great as an LRU cache and is much more space-efficient than an in-process LinkedHashMap, especially when the keys and values are small. Plus, an LRU wreaks havoc with the Java generational garbage collector as soon as it fills up (every entry you put in is about guaranteed to last until the oldest generation, then likely be removed).
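The LinkedHashMap trick translates directly to JavaScript, where Map's insertion-order iteration gives the same access-ordered eviction. A minimal sketch, not production code:

```javascript
// LRU cache built on Map: re-inserting on access keeps the
// iteration order oldest-first, so eviction is just "delete the
// first key".
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key); // re-insert to mark most-recently-used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      this.map.delete(this.map.keys().next().value); // evict LRU
    }
  }
}

const cache = new LRUCache(2);
cache.set('a', 1);
cache.set('b', 2);
cache.get('a');    // touch 'a'
cache.set('c', 3); // evicts 'b', the least recently used
console.log([...cache.map.keys()]); // ['a', 'c']
```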
8
funkah 9 hours ago 0 replies      
Mailinator, even considering any praise it has ever gotten, is still one of the most underrated tools on the internet. I love it and use it all the time.
9
newman314 10 hours ago 1 reply      
Reading about another algo (Locality Sensitive Hashing) as referenced in the first comment.

http://www.stanford.edu/class/cs345a/slides/05-LSH.pdf

10
pbiggar 9 hours ago 1 reply      
In an aside he mentions you should use bubblesort instead of quicksort for small arrays, due to cache locality, etc. I'd recommend using insertion sort instead of bubblesort - it does much better in both cache locality and branch performance (one branch prediction miss per key).
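For reference, the suggested small-array fallback is the standard textbook insertion sort:

```javascript
// Insertion sort: sequential access (cache-friendly) and one
// mispredicted branch per key, which is why real quicksort
// implementations fall back to it for small subarrays.
function insertionSort(a) {
  for (let i = 1; i < a.length; i++) {
    const key = a[i];
    let j = i - 1;
    // Shift larger elements right, then drop key into place.
    while (j >= 0 && a[j] > key) {
      a[j + 1] = a[j];
      j--;
    }
    a[j + 1] = key;
  }
  return a;
}

console.log(insertionSort([5, 2, 4, 6, 1, 3])); // [1, 2, 3, 4, 5, 6]
```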
11
wolf550e 4 hours ago 0 replies      
1. I think the author calls DEFLATE an LZW algorithm. It isn't (DEFLATE is LZ77 plus Huffman coding, not LZW).

2. Has the author looked at Google Snappy? It does 500MB/sec.
http://code.google.com/p/snappy/source/browse/trunk/format_d...

There is a pure-C implementation that might be easier to port:
https://github.com/zeevt/csnappy

12
steffes 9 hours ago 1 reply      
Just when I thought I knew everything there is to know about compression algorithms, along came Pauli, and Voila, mind now blown.
13
dredmorbius 10 hours ago 1 reply      
Further efficiencies can be gained by removing extraneous apostrophes from possessive "its".
14
iag 10 hours ago 0 replies      
Reading this article makes me giggly inside.
26
From the IE Team: Google Bypassing User Privacy Settings msdn.com
306 points by ecaron  1 day ago   172 comments top 5
1
nostromo 1 day ago 4 replies      
This seems to be a problem with the design of P3P more than anything.

Browsers: "3rd-party cookies are blocked unless you add a P3P header..."

Websites: "Ok. What should be in the header?"

Browsers: "Anything... it doesn't matter. Just add the header then 3rd-party cookies are fine"

Websites: "Ok, we'll just add a P3P header saying 'Ceci n'est pas une P3P header' then. Problem solved."
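Concretely, the "workaround" in that exchange amounts to response headers like these. The values are illustrative, not taken from any real site, and the point is that the P3P value need not be a meaningful compact policy at all:

```javascript
// What a third-party widget server might send: a nonsense P3P
// compact policy alongside the cookie it wants to set.
function thirdPartyResponseHeaders() {
  return {
    'P3P': 'CP="Ceci n est pas une P3P header"',
    'Set-Cookie': 'tracker=1; Path=/',
  };
}

const headers = thirdPartyResponseHeaders();
// Per the thread, the header's mere presence is effectively what
// gets the cookie through.
console.log('P3P' in headers); // true
```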

2
gyardley 1 day ago 3 replies      
Google really should have the cojones to stand up and state their actual position plainly, which as far as I can tell is this:

"If you haven't taken an active, positive step to block our +1 buttons, we're going to assume you don't really care and we'll do whatever we can to show them to you, no matter what your browser's default settings are. Why? Well, because we think the default settings are bullshit, and 99 times out of 100 they're only that way because they're the default. They don't reflect actual user preferences, they reflect other browsers messing with our business plans."

Not only is that an intellectually honest position, it's a lot more accurate than assuming all IE users who haven't changed their settings don't want +1 buttons.

3
yanw 1 day ago 3 replies      
So the WSJ publishes another one of its alarmist articles about Google and Safari during the weekend, and Microsoft wants to capitalize by pretending it just now discovered that P3P (a defunct and shitty protocol) is useless and no one uses it.

NYT September 17, 2010:

http://bits.blogs.nytimes.com/2010/09/17/a-loophole-big-enou...
If you rely on Microsoft's Internet Explorer's privacy settings to control cookies on your computer, you may want to rethink that strategy.
Large numbers of Web sites, including giants like Facebook, appear to be using a loophole that circumvents I.E.'s ability to block cookies, according to researchers at CyLab at the Carnegie Mellon University School of Engineering.
A technical paper published by the researchers says that a third of the more than 33,000 sites they studied have technical errors that cause I.E. to allow cookies to install, even if the browser has been set to reject them. Of the 100 most visited destinations on the Internet, 21 sites had the errors, including Facebook, several of Microsoft's own sites, Amazon, IMDB, AOL, Mapquest, GoDaddy and Hulu.

Google doesn't support a broken feature that is exclusive to IE, and somehow it's their fault. If anyone ever doubted Microsoft's PR sleaziness and propaganda tactics, that blog post is proof.

4
jtchang 1 day ago 4 replies      
P3P is a load of garbage as it is implemented/written.

There is no real enforcement behind it and it just causes lots of confusion. Seriously, I have to go look up what each of these acronyms means in order to figure out how my privacy is being violated? What guarantees do I even have that you are obeying P3P and not simply sending it to make me feel good?

Hell, while we are at it we should implement P3P for phone apps. I'm sure Path (and others) will stop uploading your address book if the P3P says "ADDRBKNOUP".

5
kylemaxwell 1 day ago  replies      
I've been a Google fanboi for years and defended them in the public square when they've been accused of nefariousness. But these revelations of intentionally ignoring users' privacy settings have shaken me. Maybe it's time to put them into the Facebook category, where I removed my account years ago.
27
Open Source VLC media player 2.0.0 is out videolan.org
291 points by jbk  3 days ago   98 comments top 8
1
jerrell 3 days ago 3 replies      
Wow. I cannot believe the pettiness of comments here. VLC is a fantastic media player, and I'm quite appalled to see the conversation here dominated by such little gripes. VLC plays media more reliably than any other program I've tried, on Windows, Linux or Mac OS X. And whether or not you consider it perfect, reaching the 2.0 milestone is something to be lauded, not bitched about.
2
dmix 3 days ago 4 replies      
Still doesn't have an option to remember playback position. Users have been asking for this on the forums for years.

Most of us use VLC to watch movies and not all of us finish them in one sitting. I hate having to find where I left off...

3
babebridou 3 days ago 3 replies      
Anyone else having issues with the delay in volume control? I'm used to the volume changing instantly whenever I change it with the mouse wheel, but in 2.0.0 there seems to be an annoying delay of about a second for each change, at least on my PC.
4
krig 3 days ago 1 reply      
I had to go in and delete the old version of VLC (1.12) that I had installed, and also delete the old preferences before the new version would work properly for me.

The new graphical look in OSX (perhaps other systems as well, I don't know about them) has gotten some critique, but I think it looks great.

5
mitchty 3 days ago 1 reply      
Ok, so first thing it did upon starting a movie in OSX was decide to rebuild the font cache.

Wasn't this a problem on Windows that was fixed?

6
chrisballinger 3 days ago 3 replies      
Does the move to LGPLv2.1+ for libVLC, libVLCcore and libcompat allow for Applidium's VLC iPhone port (http://applidium.com/en/applications/vlc) to be resubmitted to the App Store after some minor changes?
7
adrianscott 2 days ago 1 reply      
First off, VLC is awesome, and VLC team are heroes, imho.

I ended up having to go back to 1.11, as the new version ended up choking on the 1080p60 files my vidcam produces, though at first, for a minute or two, it worked better than 1.11 win32 (where I have to slow it down to 67% speed to get smooth playback).

The UI changes (the removal of slow down / speed back up, and of seeing the playback speed) threw me for a moment, but then I saw I could customize the UI really easily, which was great.

Can't wait till the day it can tap into my Nvidia GPU (a 460), but I know that's a non-trivial problem.

Anyhow, overall, congrats to the team, and I look forward to some additional upgrades so I can tap into the new features.

-a

8
moonchrome 3 days ago  replies      
>New video outputs for Windows 7, Android, iOS and OS/2.

OS/2? Why?

28
Apache releases first major new version of popular Web server in six years zdnet.com
288 points by thenextcorner  14 hours ago   84 comments top 5
1
jbarham 8 hours ago 4 replies      
It may sound trivial, but the thing I appreciate most about Nginx is its lightweight config file syntax.

It's very easy to glance over and see what's been set up, compared to Apache's verbose pseudo-XML syntax, which is about the worst syntax you could come up with: the verbosity of XML, but without the benefit of being able to generate or parse it using standard XML tools!
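To make the comparison concrete, here is the same minimal virtual host in each syntax (a hypothetical example.com setup, not one from the article):

```nginx
# nginx: flat braces-and-directives, one setting per line
server {
    listen 80;
    server_name example.com;
    root /var/www/example;
}
```

```apache
# Apache: pseudo-XML container syntax
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example
</VirtualHost>
```

The Apache form looks like XML, but an XML parser will reject it (unquoted attributes, `*:80` in a tag name), which is the "worst of both worlds" complaint above.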

2
Garbage 13 hours ago 3 replies      
Overview of new features in Apache HTTP Server 2.4

http://httpd.apache.org/docs/2.4/new_features_2_4.html

3
krmmalik 13 hours ago 1 reply      
I guess real-world testing is going to be the best indicator of whether this is a worthy release or not, but I sure am very glad that Apache has at least attempted to up its game. Even if they are not able to deliver on their promise, the effort is noble enough, at least for now.

Personally, I'm very glad to see performance considerations being taken seriously, and even if nginx or node.js don't take over the world, it's nice to see that they're forcing others to sit up and think.

4
xpose2000 12 hours ago 0 replies      
I'm excited. Even if it is only a 5% to 10% improvement in performance, then that buys me a little bit more headroom on my current server setup.

I look forward to testing it out down the road.

5
cmaxwell 14 hours ago  replies      
Expecting to see some meaningless benchmarks soon.
29
John Nash's Letter to the NSA agtb.wordpress.com
288 points by ckuehne  4 days ago   28 comments top 5
1
thebigshane 4 days ago 1 reply      
National Geographic recently had a "special" on "Inside the NSA: America's Cyber Secrets" where they mentioned and showed this letter. They said the NSA didn't end up doing anything with it but still wanted to classify it for 50+ years so that no one else could use the ideas within.

My favorite parts of the episode were:

- All of the Windows XP machines everyone was using

- The flashing red lights on the ceiling in secure areas (familiar for those that have been in similar secure facilities)

- The obnoxious re-enactments where real employees pretend to gather and discuss on-going developments. It was outright silly.

The episode just aired in January and it looks like it isn't on their site yet, but there are related videos: http://www.nationalgeographic.com/search/?search=%22Inside+t...

EDIT: The NSA press release mentions it too. They say "featured" but they didn't spend more than 5 minutes out of the hour program.

   The Nash letters were also recently featured on the National Geographic [...]

2
moonboots 4 days ago 3 replies      
I can't help but think that the NSA would have taken the letter more seriously if it had been typed.

http://www.nsa.gov/public_info/_files/nash_letters/nash_lett...

3
ohashi 4 days ago 0 replies      
I am pretty sure you can see them at the cryptography museum right next to the NSA. I went two weeks ago and the Nash letter is right in the front.
4
skrebbel 3 days ago 1 reply      
I wonder what letters / blog posts / emails are written now that will make us look back in awe in 60 years.
5
hsmyers 4 days ago 5 replies      
Tough call: part of me says that they blew it off, while the rest of me says they put the information to work (as best they were able). In any event, I doubt they got back to him...
30
Please Steal These webOS Features ignorethecode.net
282 points by ugh  12 hours ago   135 comments top 4
1
rogerbinns 3 minutes ago 0 replies      
Android also has a central place for your accounts, and there is an AccountManager API to get access to them. You'll find your Google, Skype, Facebook, LinkedIn, etc. accounts in there (and apps can add their own). http://developer.android.com/reference/android/accounts/Acco...

Since iOS doesn't have this, and many developers did their apps on iOS first, they ignore the functionality and ask you for credentials all over again. It is extremely annoying. I regularly contact app developers to point out that they could do better. https://plus.google.com/110166527124367568225/posts/bz1pN3az...

I encourage other Android users to do the same. You should never have to re-enter credentials the system already knows about.

2
fragsworth 11 hours ago  replies      
I know lots of Apple fanatics are extreme minimalists, but I really think iOS should also steal the back button from Android. I very much dislike it when one app causes another app to open, and then I have to press the home button and find the original app to go back to it. And if I'm lucky, it's still in the same state as before.
3
untog 11 hours ago 1 reply      
I never got to try WebOS, and I really wish I had (I know I can download an emulator, but you can't emulate day-to-day usage).

That said, I'm using a Windows Phone these days, and it's fantastic. No, really. The UI is amazing, and going back to my old Android phone feels incredibly clunky by comparison. I wouldn't recommend one just yet; there's still work to do. For one, third-party apps can't interface with native apps: for instance, the Messaging app seamlessly combines SMS, Facebook and Live chat, but I can't hook in GChat. If/when they get that set up, they'll have a very loyal customer in me.

4
firefoxman1 8 hours ago  replies      
>(On task switching) "Nothing else I've seen comes close."

So true. It just feels SO much more like multitasking than any other platform.

And notifications? I agree, amazing.

I wish the article had covered the Gesture Area. I know the Touchpad did away with it, but the Gesture Area made everything so fluid, easy and intuitive.

Another interface feature I LOVE is the swipe-to-delete. The super-hot iOS ToDo app "Clear" has that exact behavior and people love it.

And how about Touchstone charging? Sooo nice. Pretty sure Palm was the first mainstream phone maker to have this functionality built in.

One more feature that was amazing was the ability to bump-to-sync with other newer webOS devices. It's like the popular app Bump on steroids.

...And of course we all know how the apps were HTML/JS, which was a brilliant idea. Why make everyone learn a new language just to write apps?

       cached 22 February 2012 05:11:01 GMT