hacker news with inline top comments, 27 Dec 2010
Show HN: Radi, my HTML5 content creation app (Mac) radiapp.com
195 points by pavlov 8 hours ago   26 comments top 9
44 points by pavlov 8 hours ago 6 replies      
Radi is a side project of mine that has kept me occupied for the better part of 2010. I'm very happy to finally have it in a half-presentable state, just in time for 2011! What's that they say about the first release -- "if you're not embarrassed, you waited too long?" Well, this is that one for me :)

It's scary to put this app out there to the whole world in its current condition, but I feel like I've reached a dead-end.

There are too many directions in which this app could be developed. I need to cut down on the abstract ambition and start focusing on some real-world use cases.

For that, I desperately need your help! What's your biggest need in the world of HTML5 content creation? Do you see a path for this app to be developed into the tool that would fit your needs?

Thanks for any input, and (slightly prematurely) a great 2011 for everyone on HN! I appreciate this community enormously.


A few words about the technical details of this app...

Although the app targets HTML publishing, the compositing core doesn't actually use any kind of browser engine like WebKit. Instead, it's a node-based GPU-accelerated compositing system of my own devising. Layers and effects are rendered using OpenGL.

Because there's no browser engine, there's also no browser-style DOM available in Radi. I've written private implementations of the relevant JavaScript APIs, primarily Canvas. The JavaScript engine used is WebKit's JavaScriptCore, a.k.a. Nitro in its current accelerated incarnation.

The app is native Cocoa, but it's written in a way that's fairly portable. (From my other projects I already have a large part of the core ported to Windows+Direct3D.)

14 points by davidu 7 hours ago 1 reply      
Congratulations, you are about to be offered a lot of money from a big company.

Do you take it? ... Or do you create a new software empire for the creative web development world to rival Adobe but built on open standards?

12 points by robterrell 7 hours ago 1 reply      
First of all, when they come and offer you a pile of money (and they will), please insist on a clause that they release it. Someone may try to bury it.

Anyway, I had fun playing with it. The timeline gives me happy/terrifying Director/Flash flashbacks. I can't figure out how to make an animation cycle in the timeline (i.e. like a walk animation) without cheating & creating keyframes by moving stuff to a slightly different (x,y).

I also couldn't figure out how to script top-level elements (does everything need to be composited inside a canvas?). I added a number of "picture layer" objects that I wanted to move via keyboard control. I couldn't figure out how to put a script in the topmost layer to do this (nor could I see how to reference the sub-layers so that I could move them). I might have been fooling myself -- it looks so much like Flash or Director, that I expected to be able to write a top-level script that controls all of the sub-elements. If that's not possible due to the overall layer architecture, you should make that clear (by making the other scripting opportunities explicit when available).

A few quick suggestions:

- The "command line" input in the script editor is a great idea, like the "message window" of HyperCard or FaceSpan. I think you'd do well to make it its own window, and have the last result displayed underneath the input area (ala FaceSpan) so I don't have to wrap everything in sys.trace.

- Positioning elements via arrow keystrokes doesn't seem to work -- please implement bindings similar to Interface Builder (or FaceSpan) where modifiers do useful things like snap to grid and stretch.

- JavaScript API reference menu item doesn't seem to work. I found it eventually (http://lacquer.fi/developer/conduit-js/#sys in case anyone else has the same problem).

Overall, it looks great. It's hard to believe you kept it quiet long enough to do all of this. Congratulations!

2 points by snprbob86 3 hours ago 0 replies      
Very cool! Nice job.

I noticed this: "Radi makes creating vector graphics easier with its unique autosmoothed shapes. When you apply smoothing to a corner point, all your control points lie precisely on the shape's outline. There are no separate "handles" or "tangent points" to be tweaked outside of the actual shape."

Just wanted to point out that, to me, that sounds like an anti-feature. Maybe it's a nice alternative tool for quick, smooth shapes, but I've watched many an illustrator tweak the hell out of those things to get exactly the right curves and sharp angles. I also remember reading a story about how Microsoft spent weeks observing video of expert Adobe suite users with various Pen-tools to get the feel of the handles juuust right for an easy transition to Expression suite products.

This is a cool side project and letting HN hammer on it is a great start, but you're going to need some artists to hammer on it before you've got something really special.
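For the curious: the "autosmoothed shapes" quoted above sound like an interpolating (Catmull-Rom style) spline, where each segment's Bezier handles are derived from the neighboring on-curve points rather than edited directly. This is a guess at the general technique, not Radi's actual implementation; a minimal sketch:

```python
def catmull_rom_handles(p0, p1, p2, p3):
    """Derive cubic Bezier handles for the segment p1 -> p2 from the
    neighboring on-curve points, Catmull-Rom style. The tangents come
    from the points themselves, so there are no free-floating handles
    for the user to tweak off the outline."""
    h1 = (p1[0] + (p2[0] - p0[0]) / 6, p1[1] + (p2[1] - p0[1]) / 6)
    h2 = (p2[0] - (p3[0] - p1[0]) / 6, p2[1] - (p3[1] - p1[1]) / 6)
    return h1, h2

# Four on-curve points; the middle segment's handles are implied:
print(catmull_rom_handles((0, 0), (1, 0), (2, 1), (3, 1)))
```

The trade-off the commenter raises is real: with implied handles there is nothing to drag when an illustrator wants a curve the smoothing rule doesn't produce.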

6 points by barredo 7 hours ago 0 replies      
I guess we have here the Adobe Flash Studio for HTML5. Nice job!
4 points by Samuel_Michon 7 hours ago 1 reply      
This is the best Festivus present ever!
1 point by chewymetal 1 hour ago 0 replies      
I've been trying to assess the look of the sites I go to lately (mostly clean and simple ones). I like what you've done. Clean with just a touch of depth. I haven't consumed the content yet, but it looks tasty. I'm looking forward to trying the beta.
2 points by elvirs 7 hours ago 1 reply      
I wish all browsers become CSS3 and HTML5 compliant overnight.
That would make the web a way better place.
1 point by tomdeal 2 hours ago 0 replies      
That's really amazing!
4Chan user survey results. Not many surprises. google.com
62 points by steveklabnik 4 hours ago   39 comments top 12
3 points by pmorici 36 minutes ago 1 reply      
65 people "discovered 4chan" in 2011... why was that even a choice, and what makes people think that a group of people who take nothing seriously would take a survey seriously?
5 points by itistoday 2 hours ago 0 replies      
For the genres/fetishes, it would've been interesting to see how those differ between male/female survey participants.
3 points by pyre 3 hours ago 1 reply      
Heh. The graphic for the question "What boards do you most typically lurk?" is broken. Attempting to open it in a new tab results in the following error message:

  Bad Request
Your client has issued a malformed or illegal request.
The parameter 'chs=345x1000' specifies a chart with
345000 pixels, which exceeds the maximum size of
300000 pixels.

Is this a bug in the doc? Or a bug in Google Docs? It would seem to be a bug in Google Docs if the size of the chart was auto-generated based on the data, but exceeded the limits of Google Docs chart generation. Especially since on the resultant page it just looks like a broken image with no feedback to the user.
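The numbers in the error are easy to check: the chart URL asks for 345 x 1000 = 345,000 pixels against a stated cap of 300,000. A quick sketch of the check the API is presumably applying (the function name here is made up):

```python
MAX_PIXELS = 300_000  # the pixel cap quoted in the error message

def chart_fits(width, height, max_pixels=MAX_PIXELS):
    """Return True if a width x height chart stays within the pixel cap."""
    return width * height <= max_pixels

print(345 * 1000)             # 345000 pixels requested
print(chart_fits(345, 1000))  # False: the request is rejected, as in the error
print(chart_fits(345, 869))   # True: 345 * 869 = 299805, just under the cap
```

So a chart sized from the data can silently cross the limit, which supports the "bug in Google Docs" reading: the generator should clamp the height before building the URL.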

10 points by Anonomos 3 hours ago 1 reply      
I watched a similar poll live as it ran in /b/ around two years ago, maybe earlier.

There were several questions about mental health (medically diagnosed pathologies, suspected pathologies, ...).

The sex ratio was around 10 to 1.

All girls reported moderate to severe mental issues, whereas only 1 dude in 10 did.


3 points by Helianthus16 3 hours ago 0 replies      
Man, that makes me feel all warm and nostalgic. Of course everyone's young, learning about the freedom of the internet and using their creativity for the first time. In my day it was LUE on the GameFAQs forum, though I was more of a Forumite at the time...
2 points by olalonde 1 hour ago 0 replies      
3% of the people who completed the survey don't speak English. Just a friendly reminder to take this and any survey with a grain of salt.
1 point by icefox 1 hour ago 2 replies      
>50% keeping a porn collection

I can only imagine what a teenager is able to amass these days with TB hard drives, and what others might think when they go through it. From a security perspective, if [they] suspect that [you] have a porn collection, would it be prudent to keep a small and neutral porn collection somewhere easily found on their system? With the percentage being so high, they (parents) might keep looking until they find something.

1 point by FooBarWidget 2 hours ago 1 reply      
4% report themselves as gay/lesbian while 19% as bisexual. I don't believe any of this. Is claiming to be bisexual a new fashion trend or something? Assuming it is a fashion trend, I don't understand where the trend came from; most 4chan users are male, men tend to be homophobic and if a man is bisexual then he also sleeps with other men. This doesn't make sense.
2 points by norswap 3 hours ago 1 reply      
Apparently 10% of 4chan users are criminals. Boy, I bet some people had fun with the survey...
2 points by cskau 3 hours ago 3 replies      
I think the most surprising info in there is that there seems to be a rather normal proportion of homo- and bisexuals. You'd think the often hateful language towards them would mean there'd be fewer than normal.
3 points by wizawuza 3 hours ago 4 replies      
Maybe it's just me, but the link doesn't seem to work. Just shows "Summary"
2 points by elvirs 3 hours ago 1 reply      
that website hosts a crowd of interesting (in a way) people O_O
The $85 Smartphone and the imminent extinction of non-smartphones asymco.com
9 points by barredo 59 minutes ago   7 comments top 3
5 points by earl 8 minutes ago 1 reply      
Am I the only person who loves my dumb phone?

For starters, it just works. A good friend has a Motorola Droid X. Android is a useless piece of shit -- his alarm clock keeps crashing with NPEs. This was one of the premier android phones when he bought it, and he left it completely stock. I, along with many friends of my generation, use our phones as our alarm clock and watch, so this is obviously pretty useless.

Second, my three year old motorola krzr is robust. It's been face down in my pants pocket while sliding down 500 feet of ice when I wiped out on some really icy moguls last year. That shattered the front. It's survived prolly 100+ impacts in my coat while snowboarding, though none as brutal as the above. The paint is peeling off the entire back. It's dented, prolly from a different fall snowboarding. It fell out of my coat pocket and hung out in the snow for 8 hours next to our car last year whence I fortunately saw it before driving off. Through all of it, it still makes / receives calls and makes / receives texts. I'm pretty sure I'd kill a smartphone inside 5 snowboarding days, tops. I still get 30 hours of standby battery life, though I'll probably have to replace the battery soon. My friends w/ iphones and the like seem to rush between charges, and last weekend at the ski house, there was a rush to find the one person who brought an iphone charger. I didn't charge all weekend.

tl;dr -- dumb phones are robust, have great battery life, and generally work well at their limited functionality

6 points by Zev 25 minutes ago 3 replies      
Just because smartphones have a higher profit margin for companies, doesn't automatically mean that consumers will buy a smartphone, let alone will want one.

Non-smartphones aren't going anywhere. Not until smartphones are given away for free and you can pick one up without being forced to add a data plan.

2 points by iworkforthem 11 minutes ago 0 replies      
I don't quite agree with the imminent extinction of non-smartphones; I think the two will coexist, with smartphone market share increasing gradually.

- Much of the world still lacks the infrastructure to support the data requirements of smartphones, especially in Africa and Asia.

- There is still a need for non-camera/non-storage/non-smart phones in certain defense/government related jobs.

- A large part of the population is over 35 years of age; most of them use a non-smartphone, and they use the phone mostly just to make calls.

Day in the life of a Googler (Matt Welsh) matt-welsh.blogspot.com
52 points by siddhant 5 hours ago   22 comments top 7
20 points by mbm 3 hours ago 3 replies      
This reminds me of a quote I read from a lecture a few years back by a history prof at Colby:

"How long will you need to find your truest, most productive niche? This I cannot predict, for, sadly, access to a podium confers no gift of prophecy. But I can say that however long it takes, it will be time well spent. I am reminded of a friend from the early 1970s, Edward Witten. I liked Ed, but felt sorry for him, too, because, for all his potential, he lacked focus. He had been a history major in college, and a linguistics minor. On graduating, though, he concluded that, as rewarding as these fields had been, he was not really cut out to make a living at them. He decided that what he was really meant to do was study economics. And so, he applied to graduate school, and was accepted at the University of Wisconsin. And, after only a semester, he dropped out of the program. Not for him. So, history was out; linguistics, out; economics, out. What to do? This was a time of widespread political activism, and Ed became an aide to Senator George McGovern, then running for the presidency on an anti-war platform. He also wrote articles for political journals like the Nation and the New Republic. After some months, Ed realized that politics was not for him, because, in his words, it demanded qualities he did not have, foremost among them common sense. All right, then: history, linguistics, economics, politics, were all out as career choices. What to do? Ed suddenly realized that he was really suited to study mathematics. So he applied to graduate school, and was accepted at Princeton. I met him midway through his first year there--just after he had dropped out of the mathematics department. He realized, he said, that what he was really meant to do was study physics; he applied to the physics department, and was accepted.

I was happy for him. But I lamented all the false starts he had made, and how his career opportunities appeared to be passing him by. Many years later, in 1987, I was reading the New York Times magazine and saw a full-page picture akin to a mug shot, of a thin man with a large head staring out of thick glasses. It was Ed Witten! I was stunned. What was he doing in the Times magazine? Well, he was being profiled as the Einstein of his age, a pioneer of a revolution in physics called "String Theory." Colleagues at Harvard and Princeton, who marvelled at his use of bizarre mathematics to solve physics problems, claimed that his ideas, popularly called a "theory of everything," might at last explain the origins and nature of the cosmos. Ed said modestly of his theories that it was really much easier to solve problems when you analyzed them in at least ten dimensions. Perhaps. Much clearer to me was an observation Ed made that appeared near the end of this article: every one of us has talent; the great challenge in life is finding an outlet to express it. I thought, he has truly earned the right to say that. And I realized that, for all my earlier concerns that he had squandered his time, in fact his entire career path--the ventures in history, linguistics, economics, politics, math, as well as physics--had been rewarding: a time of hard work, self-discovery, and new insight into his potential based on growing experience."

8 points by grandalf 3 hours ago 1 reply      
I think it's too soon to tell how things will go at Google. He seems like a dopamine junkie (I can relate) so maybe after a few months he'll be checking HN and Engadget from Google as well.

For a very smart guy like Matt, chances are boredom will set in after a while... it will really be a test of Google to see if it can capture his imagination for 7 hours a day after he's worked there 6-9 months and all the novelty is gone.

8 points by dominostars 3 hours ago 1 reply      
This article reminds me of those weight loss infomercials:

- In 'fat' mode, the video color is bad, and the person is frowning, sad.

- In 'skinny' mode, the video color is clear, the person is smiling, and their complexion is better.

This gives the illusion that there's a dramatic change in the person's weight, when often they've only lost 10 pounds (which is great, just not dramatic). In the same way, there doesn't seem to be too big a change in Matt's schedule; there's just a shift in attitude. It probably wasn't a requirement to "Groan at the amount of work I have to do before the onslaught of meetings in the afternoon", or to "spend next 45 minutes reading Engadget, Hacker News, and Facebook". In the end, he had about 3 hours to work at Harvard, and did about 4 hours of work at Google.

It's interesting to dig into what's actually changed between jobs, because you might not completely know what you want in a work environment.

12 points by mattlong 4 hours ago 4 replies      
Sounds like the author likes hacking but loathes being a professor. I can't help but assume he's doing a huge disservice to his students by being so uninterested in teaching that he forgets about/doesn't even try to improve his lectures from the year before.
4 points by bhoung 1 hour ago 0 replies      
Interesting. But I'm mostly concerned about the amount of soft drink consumed.
4 points by kunjaan 2 hours ago 0 replies      
When I glanced over the schedule he had at Harvard, I thought it was from PHDComics.
2 points by pmorici 3 hours ago 1 reply      
So now that he is at Google it looks like he is only really working from 9:00 to 4:00 (7 hrs.) at Google vs. over 8 hrs. as a prof, but in those 7 hours he is getting much more meaningful work done.
How Allies Used Math Against German Tanks wired.com
20 points by Luyt 2 hours ago   5 comments top 2
2 points by lucasjung 36 minutes ago 0 replies      
When I saw the headline, my first thought was that the math was something along the lines of:

A = number of U.S./allied tanks in each battle

D = number of German/axis tanks in each battle

A >= 2D

4 points by bl4k 2 hours ago 1 reply      
or why not to put autoincrement ids into your webapp routes
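The reference is to the German tank problem: Allied statisticians estimated monthly tank production from the serial numbers on captured tanks. Under sequential numbering, the classic frequentist estimate of the total is m * (1 + 1/k) - 1, where m is the largest observed serial and k the sample size. A minimal sketch (the sample values are made up):

```python
def estimate_total(serials):
    """Minimum-variance unbiased estimate for the German tank problem:
    N_hat = m * (1 + 1/k) - 1, where m is the largest observed serial
    number and k is the number of observations."""
    k = len(serials)
    m = max(serials)
    return m * (1 + 1 / k) - 1

# Serial numbers observed on captured tanks (illustrative values):
print(estimate_total([19, 40, 42, 60]))  # 74.0
```

This is exactly why sequential ids in public URLs leak business volume: a single observed id bounds the total, and a handful of observations pins it down.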
The ghost of _why returns to teach people programming & Ruby hackety-hack.com
36 points by yarapavan 4 hours ago   11 comments top 3
9 points by steveklabnik 3 hours ago 1 reply      
Haha, I'm not a ghost, but if I can be half the coder _why was, I'll be happy.
3 points by stevejohnson 3 hours ago 1 reply      
1 point by puredemo 35 minutes ago 1 reply      
Didn't work on ubuntu. ;(

/tmp/selfgz13504/hacketyhack-bin: error while loading shared libraries: libssh2.so.1: cannot open shared object file: No such file or directory

How To Talk to Investors About Your Competitors bothsidesofthetable.com
7 points by Cmccann7 58 minutes ago   1 comment top
1 point by lionhearted 4 minutes ago 0 replies      
Love this quote, a great reminder:

> Remember: being too early is the same as being wrong.

Man quits job, makes living suing e-mail spammers yahoo.com
22 points by iwh 3 hours ago   8 comments top 4
6 points by bediger 2 hours ago 2 replies      
Two things wrong with this article:

1. By not using the harshest terms possible to denounce spam and the thieves who spam, this article tends to promote spam as something we all "just hate" but have to get along with.

2. This article does not note that the Man who Quit his Job is actually something of a meta-parasite: he produces no economic output himself, but preys on spammers who themselves produce none. It would be worth connecting this with research in artificial life, and with research in biology on parasites-of-parasites.

1 point by bretpiatt 29 minutes ago 0 replies      
IANAL, but it sure seems that by not fighting these early, smaller individual cases, the spammers are setting themselves up for a huge class action case in the future by establishing precedent.

Any lawyers here? How do no-contest rulings in small claims court apply to setting precedent?

It looks like a fair amount of the time he's settling out of court -- I also wonder if any of the companies settling have deep enough pockets to be worth organizing a large class?

1 point by iwwr 1 hour ago 1 reply      
Considering he is suing many small companies, which prefer to settle rather than pay for a lawsuit, it's not clear he is doing a net service. A serial litigator has economies of scale in lawsuits; only the largest companies can afford to keep lawyers on staff or to hire them regularly.

So as much as I'd like to applaud people who fight spam, I think this guy just found a legal niche to mess with small and legitimate companies. It's like the person going around suing shops over legally required accessibility amenities (ramps, ramp angles, parking, toilets).

1 point by grav1tas 1 hour ago 0 replies      
It's an interesting approach to spam. It's kind of like the way people dealt with the tobacco industry in the 90's (and maybe still now? I haven't kept up) by just suing them for damages, false advertising, misleading stuff, or whatever else.

I think that this is one of those less heard of positives of having a litigious society. Sometimes we can curb nasty behavior by making rules that allow people to sue violators into submission.

The progress bar illusion newscientist.com
88 points by dchest 9 hours ago   28 comments top 8
17 points by philwelch 5 hours ago 0 replies      
If a download or something seems to be taking too long, I'll actually put the point of my mouse cursor at the edge of the progress bar to see whether it's stuck or just moving really really slowly. I really don't like these illusions.
13 points by phsr 8 hours ago 3 replies      
Apple has been using the "left moving ripple" progress bar for a while, which makes it look like the progress bar is moving, even when it's not
4 points by miguelpais 7 hours ago 1 reply      
The biggest illusion of progress bars is when they don't accurately represent the time left for something to complete. Like when you wait 7.5 minutes for the bar to reach 75% and then it suddenly jumps to 100%, instead of taking another 2.5 minutes to complete (which is frequent in installation processes).

Of course, when it comes to download progress bars it's not possible to make them accurate without letting the estimate grow as the download speed drops, and making it infinite/disabled when the speed is 0 KB/s. But in other, offline tasks the progress bar is frequently useless for capturing the time remaining to completion.

That's probably why a time remaining label is added to them.
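One common way implementers make that time-remaining label less jumpy is to estimate from an exponentially smoothed transfer rate rather than the instantaneous speed. A hypothetical sketch (the function name and alpha value are made up):

```python
def eta_seconds(done, total, speed_sample, state, alpha=0.3):
    """Estimate seconds remaining from an exponentially smoothed transfer
    rate. `state` carries the smoothed speed between calls; `alpha`
    weights the newest sample against the history."""
    smoothed = state.get("speed")
    if smoothed is None:
        smoothed = speed_sample
    else:
        smoothed = alpha * speed_sample + (1 - alpha) * smoothed
    state["speed"] = smoothed
    if smoothed <= 0:
        return None  # stalled (the 0 KB/s case): no meaningful estimate
    return (total - done) / smoothed

state = {}
print(eta_seconds(25, 100, 10.0, state))  # 7.5: 75 units left at 10 units/s
```

Smoothing keeps a momentary stall from flipping the label to "infinite" and back, at the cost of the estimate lagging real changes in speed.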

2 points by shalmanese 2 hours ago 0 replies      
Rethinking the Progress Bar: http://www.chrisharrison.net/projects/progressbars/index.htm...

A paper that investigates how non-linear progress rates also affect perceived completion time.

1 point by lkozma 3 hours ago 0 replies      
Interesting illusion, but I wonder what is the desirable effect here.

I can imagine it being argued both ways:

1. the bar should be animated so that it looks faster than it really is.

2. the bar should be animated in the reverse way, so that download is actually faster than it seems from the bar, thereby giving a positive surprise to the user.

4 points by mfukar 5 hours ago 1 reply      
Despite the interesting content, I'm compelled to ask: does anyone actually watch progress bars?
2 points by erreon 7 hours ago 0 replies      
The brain is a tricky thing. Even after seeing this video, I'm sure the "left moving ripple" progress bar still seems faster to us.

Could something like this be applied to the loading animations in webapps? If the circle pulses, or slows then speeds up randomly, will it seem like the app is working faster or slower to the user?

1 point by some1else 4 hours ago 0 replies      
How about the illusion of aging 11% faster due to the spinning rainbow ball that Apple has also been using?
Git and Mercurial branching pocoo.org
42 points by steveklabnik 6 hours ago   20 comments top 7
8 points by cookiecaper 4 hours ago 1 reply      
I prefer git over hg because of its branch model. hg's doesn't compete, and git has totally changed my workflow with branches. hg seems like it would cause me to revert to the old way which is not desirable to me.

hg also requires extensions to do many things that git does out of the box (like the equivalent of git stash, which I also use often). I saw very little reason to prefer hg for the brief time I experimented with it.

7 points by rue 5 hours ago 1 reply      
An interesting article, if a little incoherent. It seemed to bounce around a bit much.

There are some reasonable means of local git sharing, by the by: the old gitjour project was fun and more recently there's gitosis and gitolite. You could also leverage Dropbox or similar, or actually work with individual repositories.

4 points by ErrantX 3 hours ago 0 replies      
Gah, I kinda came to this hoping to stick up for hg yet again. But a lot of these gripes are accurate.

It's not likely to change though. And here's the deal:

If you want something that "just works" and is awesome from the command line, Mercurial is your best friend. If you may be facing borderline cases that will wrap you in red tape, Git is a better option.

6 points by rlpb 5 hours ago 1 reply      
Git doesn't really have changesets as first-order objects; it has snapshots. Each commit is the hash of a snapshot of the tree at that point, together with the commit message and the hashes of any parent commits (zero for a root commit, one or more otherwise). This is pretty much the entire data model. The rest just pops out, which is why it is so elegant.
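The snapshot model can be made concrete: a commit object is just text naming a tree, parents, author, and message, and its id is SHA-1 over a "type length\0" header plus that text. A sketch (not git's actual code; the parent id below is a placeholder, while the tree id is git's well-known empty-tree constant):

```python
import hashlib

def git_object_id(obj_type, body):
    """Compute a git object id: SHA-1 over '<type> <len>\\0' + body."""
    data = f"{obj_type} {len(body)}\0".encode() + body
    return hashlib.sha1(data).hexdigest()

# A commit is plain text pointing at a tree snapshot and its parent(s).
commit_body = (
    b"tree 4b825dc642cb6eb9a060e54bf8d69288fbee4904\n"   # git's empty tree
    b"parent aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n"  # placeholder id
    b"author A U Thor <a@example.com> 1293400000 +0000\n"
    b"committer A U Thor <a@example.com> 1293400000 +0000\n"
    b"\n"
    b"example commit\n"
)
print(git_object_id("commit", commit_body))  # a 40-hex commit id
print(git_object_id("blob", b""))  # e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
```

The last line reproduces git's famous empty-blob id, which is a quick way to convince yourself the hashing scheme really is this simple.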
2 points by stevelosh 2 hours ago 1 reply      
I really need to sit down and write a blog post and set of aliases that let you use git-style branching in Mercurial through the bookmarks extension. It's completely possible, it's just not the standard.

The only thing I'm aware of that Git can do that Mercurial can't, branching-wise, is pushing deletes, which messes up everyone else's pulls anyway.

2 points by fendrak 4 hours ago 2 replies      
Am I the only one who thinks picking a DVCS tool based on the GUI is a bad idea? It seems to me that hg and git were designed to be used from a command line, and (for git at least) it works brilliantly. The GUI DVCS frontends try to hide a lot of the complexity of various tasks from you, but in that insulation you also end up with a clouded view of how things actually work, leading to the branching/merging mistakes the author talks about.
1 point by wkornewald 2 hours ago 0 replies      
So hg branches should behave like completely separate clones/repos that happen to live within the same folder/URL. That would fix pretty much all the issues, and it would be completely straightforward, since it mimics the separate-clones-as-branches model originally promoted by hg, just without separate folders for the clones.
Inflated Tech Valuations? Blame Uncle Sam gigaom.com
4 points by meadhikari 43 minutes ago   discuss
UNIX/Linux Sysadmin Tutorials thegeekstuff.com
56 points by macco 7 hours ago   6 comments top 3
3 points by zppx 6 hours ago 1 reply      
30 - For Linux users: learn how to use ip (and the other tools included in iproute2) instead of ifconfig, which is part of net-tools; those tools are deprecated.

A good resource for that: http://andys.org.uk/bits/2010/02/24/iproute2-life-after-ifco...

45 - Always use LVM if possible.

2 points by juddlyon 1 hour ago 0 replies      
#51: Bookmark nixCraft, articles.slicehost.com & the Linode Library.
-1 point by wooptoo 6 hours ago 1 reply      
Backing up with `dd` is stupid.
25 Even More Slick Linux Commands viewtext.org
13 points by yarapavan 3 hours ago   5 comments top 3
4 points by AdamGibbins 1 hour ago 1 reply      
I really wish these blogs would stop replacing " with “ and ' with ’.

I know it looks nicer, but it totally breaks the ability to paste into a shell :(
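A hypothetical workaround for the curly-quote problem described above: normalize the typographic quotes back to ASCII before pasting. The mapping covers the four usual offenders:

```python
# Map the typographic quotes these blogs substitute back to the ASCII
# characters a shell actually understands.
SMART_QUOTES = {
    "\u201c": '"',  # left double quotation mark
    "\u201d": '"',  # right double quotation mark
    "\u2018": "'",  # left single quotation mark
    "\u2019": "'",  # right single quotation mark
}

def unsmarten(text):
    """Replace curly quotes with plain ASCII quotes."""
    for smart, plain in SMART_QUOTES.items():
        text = text.replace(smart, plain)
    return text

print(unsmarten("grep \u201cfoo\u201d file"))  # grep "foo" file
```

(Blogs could equally apply the inverse rule only outside code blocks; the real fix belongs on the publishing side.)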

2 points by japaget 1 hour ago 0 replies      
The list stops partway through the 18th command on Firefox 3.6.13, IE 8, Safari 5.03, Google Chrome 8.0, or Opera 11.00. Use the "Original Link" in yarapavan's comment below to see the rest.
1 point by yarapavan 3 hours ago 1 reply      
Is Quora the biggest blogging innovation in 10 years? scobleizer.com
24 points by harscoat 4 hours ago   18 comments top 9
18 points by cletus 2 hours ago 4 replies      
Short answer: no.

Long answer: there is (IMHO) absolutely nothing innovative about Quora in terms of technology or concept. It's a Q&A site. No more, no less.

The only noteworthy thing about Quora is the marketing success in all the Valley insiders they've gotten to use it.

11 points by citricsquid 2 hours ago 0 replies      
> First, look at the Quora items I've been participating in. This is a lot like a blog. But it's not Dave Winer's blog style. It's any question I've followed, written in, voted up, etc

No, it's like a forum.

3 points by mmastrac 2 hours ago 0 replies      
It's not a blogging innovation - it's just a stream of activity that happens to be related to Q&A. It looks somewhat similar to someone's FriendFeed activity stream, or Twitter conversation stream.

Quora and StackOverflow might have improved the traditional Q&A site by making it more about conversations than just "HOW DO I GET BABBY" questions, but it's not blogging. Blogging is more about writing about what's on your mind. In some cases your blog entries are part of a conversation, but it's at the same level that a newspaper editorial is part of a conversation.

1 point by dannyr 14 minutes ago 0 replies      
Inside the bubble of Silicon Valley, maybe it is.

For the rest of the world, I don't think it is. I'm waiting for people outside the tech sector of the Bay Area to start using it.

3 points by kloncks 2 hours ago 0 replies      
Quora's relevance and absolute focus on completely revolutionizing the Q&A sector is fascinating. They've introduced so many innovations in that space.

Counting that as an innovation to blogging seems too far-fetched...

10 points by vyrotek 3 hours ago 2 replies      
No. But while we're here, they need to fix their search.
3 points by AndrewDucker 2 hours ago 1 reply      
What differentiates Quora over Stack Overflow?
2 points by minalecs 2 hours ago 0 replies      
is scoble an investor in quora?
1 point by chunkbot 47 minutes ago 0 replies      
Lisa Zhang, Facebook Data Intern: Things I learned lisazhang.ca
120 points by wslh 13 hours ago   26 comments top 7
33 points by huangm 8 hours ago 2 replies      
I have to wonder if a lot of the people criticizing this blog post actually read the whole thing.

Note the title: Things I learned. She's posting about her internship experience and the lessons she learned. She's not trying to give nuanced, universally applicable advice for how to become famous in every circumstance and industry.

She's talking about "fame" from a specific perspective -- that of a student who might be intimidated by the success of someone like Zuckerberg. She relates the example of Paul Butler and his "fame". It's clear he's not "famous" in the conventional sense, but that he simply created something that people in the industry liked enough to tweet and email around. She realizes that there may not be anything intrinsically different about the people who attract "fame", and that her having cut Zuckerberg in the lunch line makes him just another regular person who lines up for lunch and gets cut in line. The reason Zuck's famous is that he did something that people liked (and now he's arguably conventionally "famous" because people liked it at scale).

I found this to be a simple and useful reminder that our heroes are human too, and all of them got where they are by doing stuff.

There's an almost pathological compulsion here to analyze every word and sentence, hoping for some incisive way to show one's cleverness. I think that's missing the point.

12 points by chime 9 hours ago 4 replies      
> "Famous people are famous because they do things. There's nothing more to it, and nothing less."

I agree with the "they do things" part but wholeheartedly disagree with "There's nothing more to it." There's a LOT more to it. I have a really good friend who ran a social network from 2003 to 2006 and had millions of users. He is a brilliant coder and worked really really hard to grow the site. He "did things".

He had a vision, built the foundations himself, assembled a capable team, delegated appropriately, and oversaw the operations as needed. But he's not famous today. Why not? Because his social network didn't make it big. It was successful enough to give him a good middle-class lifestyle (better than ramen profitable) but it didn't make him millions. And today nobody knows of his brilliance or leadership skills. You could say he's not famous because he didn't push through harder, but then you'd better hold everyone else who is famous to that criterion too.

16 points by alnayyir 11 hours ago 2 replies      
Money quote:

" I realized that their ticket to fame is actually really simple: when they had an idea, they followed through.

Simply put, they did things. They executed."

Now we just need to make that stick.

6 points by chegra 11 hours ago 0 replies      
The money quote for me:

"I did some things I'm proud of and a few that I'm not. I failed a lot, but learned a lot too. Hopefully, in 2011, I won't let trivial fears set me back: I'll do more, try more, and say "yes" more. There's just too much to lose otherwise."

Fortis cadere, cedere non potest. (The brave may fall, but never yield.)

8 points by chrisbroadfoot 11 hours ago 1 reply      
Donald Knuth's Christmas Tree Lecture 2010:
3 points by paraschopra 7 hours ago 0 replies      
For a well-reasoned counterargument, may I recommend the excellent Fooled by Randomness? "Doing things" is a necessary but not sufficient condition for becoming great.
1 point by Anon84 4 hours ago 0 replies      

     1) Famous people are famous because they do things. There's nothing more to it, and nothing less.

     3) When you decide to do things, opportunities come.

These two always bear some reiteration.

Mojolicious 1.0 released - A new Web Framework for Perl kraih.com
66 points by kraih 9 hours ago   20 comments top 9
8 points by btilly 8 hours ago 0 replies      
As the article says, Mojolicious was funded in part by a grant from The Perl Foundation.

If I remember correctly, the vote for it was almost unanimous. (I'm on the grant committee.)

7 points by carlhu 8 hours ago 0 replies      
For a web engineer, http://mojolicio.us/ is as hard-hitting a brochure page as I've seen in some time. Built-in long-polling, full-stack-style templates, and simple views, combined with that one-line install. Good luck and thank you for this contribution!
1 point by chuhnk 7 hours ago 1 reply      
Highly intriguing. Perl is a solid language that has held firm ground in scripting but has been overtaken by the likes of Ruby on the web. It's nice to see the emergence of an MVC framework to help its popularity and further growth. I for one will definitely be trying this and comparing it with my current implementations in Ruby.
4 points by kraih 9 hours ago 0 replies      
For more see also the website at http://mojolicio.us
1 point by sherr 3 hours ago 0 replies      
I've been looking at Perl frameworks recently, and Dancer in particular. Dancer looks good to me - but so does this. I'd love to see it compared and contrasted to Dancer and Catalyst. Great work!
1 point by natch 9 hours ago 1 reply      
How well has the installation process been tested for userland-only (non root) installs?
1 point by sigzero 8 hours ago 0 replies      
Very awesome!
The Unwelcome Return of Platform Dependencies techcrunch.com
32 points by dave1619 6 hours ago   7 comments top 3
2 points by bl4k 4 hours ago 0 replies      
There are some interesting points, but the wrong conclusion.

The conclusion and lead-in should have been that, as with the desktop market, on the web you should not trust the platform providers.

2 points by SupremumLimit 5 hours ago 1 reply      
There is zero insight in this article as far as I can see. Yes, if you are using someone else's API, it might change. Yes, if you aren't complying with the terms of service, you may have a problem. If you are adding features to someone else's product, then sure, they can implement those features too and make you obsolete.

In addition, I don't see what any of this has to do with the initial premise that the web provided a way of avoiding the complexities of cross-platform development.

1 point by gordonguthrie 5 hours ago 0 replies      
People really should read Michael Porter's Competitive Strategy:


He points out that pricing power in a supplier relationship is held by whichever side would find it easier to integrate into the other one's space. So Apple vs App Makers, Facebook vs in-Facebook games, Twitter vs Twitter apps...

There can be a healthy acquisition route for successful startups in a supplier relationship - but only for the first one of its kind.

The “thin edge of the wedge” strategy cdixon.org
4 points by revorad 1 hour ago   discuss
Happy Birthday to the Suit economist.com
36 points by Andrew_Quentin 7 hours ago   19 comments top 8
17 points by johnohara 4 hours ago 0 replies      
As a younger man, I worked in the concrete construction business for exactly 10 hours.

In the morning, we unloaded and placed an entire truckload of foundation forms by hand. In the afternoon, we dug out the ground for a new garage pour, by hand. Then, a full truckload of gravel showed up, and we spread it, by hand.

I fell asleep in the truck on the way back to the yard.

The next day, the owner approached me, gave me $100 in cash, and thanked me for my service. That was it. Career over.

I spent the $100 on a new suit. I didn't know what I was going to do for a job, but I knew I wanted that suit as my work clothes.

9 points by Mz 6 hours ago 1 reply      
I love fashion history. There is so much social history tied up in it.


Savile Row was inhabited largely by surgeons before the tailors moved in during the 19th century, and their influence can be seen in the “surgeon's cuff”. On the most expensive suits the cuff buttons, which mirror the pips of military rank, can be undone, allowing the sleeve to be rolled back. This let surgeons attend patients spouting blood without removing their coats -- an important distinction that set them apart from shirt-sleeved tradesmen of the lower orders. Surgeon's shirts, with detachable cuffs, are still made to order by London tailors.


Colours and cuts come -- the fashion a decade ago was for four-button jackets -- and go. Yet the modern world has transformed the suit's interior. Pockets for train and bus tickets appeared with the commuter. Pen pockets and pockets for mobile phones have followed. Mr Munday has fielded inquiries about internal pockets to hold an iPad. No problem, he says. They are not so very different to the large “hare” pockets on the inside of field coats worn by country gents that will hold birds and rabbits felled with a shotgun.

6 points by j2d2j2d2 3 hours ago 1 reply      
I used to work for Bear Stearns. I was thinking of quitting before the crisis hit but had to stay and watch after the day our stock value dropped to $2 a share.

By the time I left I really hated working there. I tell some of the story here: http://j2labs.net/blog/2010/jan/11/economies-pools-and-piss/

Anyway, when I moved out of the apartment I had near Wall St, I threw out my suits. I told myself I'd never take a job that required a suit. The suit felt like a hollow nod between businessmen that the appearance of success was to be regarded just as highly as the ability to succeed.

I'm less hostile to suits today, but I much prefer to do business with the types of people that also don't like wearing them.

11 points by Umalu 5 hours ago 5 replies      
Interesting history of the suit. But while the article sees a future for the suit, to me the suit is history. Suits once denoted upper class, and sumptuary laws prevented lower classes from looking upper. Now suits denote lower middle management in an old economy company. No one aspires to that. Jeans, sweats and t-shirts are the new class separator, indicating which of us have earned the freedom from having to wear suits.
3 points by CallMeV 2 hours ago 0 replies      
I watched a TV documentary a few months back, which took a look inside the various tailors' establishments of Savile Row. Consider this: back then, these shops were the startups and entrepreneurs of their day, coming to the street with a few bolts of cloth and next to no money in their hands, spotting the needs of the customers and moving in to sell them a product specifically suited to their needs.

The age of many of these stores speaks volumes about their capacity to read the market over the decades. Anyone who's ever set up a business and wanted it to last would do well to look at these businesses' example.

6 points by andreyf 4 hours ago 1 reply      
1 point by bediger 3 hours ago 0 replies      
Another PR Hit for Men's Wearhouse, eh?
1 point by edge17 6 hours ago 0 replies      
that was enlightening
Sikuli (automated GUI testing tool) releases v. 1.0 RC1 sikuli.org
15 points by japaget 4 hours ago   3 comments top 2
A Year Of Scala joa-ebert.com
38 points by DanielRibeiro 8 hours ago   25 comments top 6
6 points by code_duck 7 hours ago 1 reply      
Great article. I've been learning Scala for a while now as time allows, and have had similar experiences - the syntax is tricky at first, and much of the community focuses on CS esoterica that isn't familiar to me, but the major concepts are quite familiar, coming from Python, Scheme and JavaScript. I'm still looking forward to my moment of epiphany when I realize I've become vaguely competent with Scala.
4 points by Homunculiheaded 5 hours ago 3 replies      
As someone who has spent a fair amount of time with Haskell, Scheme, Common Lisp, etc., can someone give me some insight into what's really exciting about Scala? There seem to be a lot of people I respect using it, but most of the articles I've read are of the form "coming from Java, Scala is awesome!" which I haven't found particularly enticing. Every time I look at examples I don't really see anything that sticks out as particularly interesting. I really want to get excited about Scala, and would really appreciate any insight that would help me with this.
1 point by Luyt 1 hour ago 0 replies      
This article demonstrates the beauty of taking away cruft until you only have the essential stuff left. I like Scala's succinct way of anonymous function notation.
3 points by sharednothing 6 hours ago 1 reply      
"I was tired of writing code like this over and over again."

This is getting tiring:

   public class MyGraphThingee extends Graph<Foo, Bar>

1 point by mike4u2 1 hour ago 1 reply      
I did a few spare time projects with Scala, but does (did) anyone use it for commercial applications? If so, what kind?
The Smartphone Explosion avc.com
39 points by harscoat 8 hours ago   35 comments top 8
9 points by replicatorblog 7 hours ago 2 replies      
I think one of the things that even brilliant guys like Fred underestimate is that Apple offers an end-to-end ecosystem that people aspire to join while Android is the default choice for smart phone carriers that don't offer the iPhone. Android will likely have greater market share, but will it be the portion of the market you want to build for?

My view might be clouded because I work in an industry where 45% of revenue comes from 5% of customers. Will the bulk of dollars available from smartphone/tablet customers come from the market share leader or the profit share leader? My guess is the latter.

Even if Android wins in smartphones, I'd bet iOS will continue to own music, tablets, and TV and TV could be bigger than the rest combined.

3 points by martythemaniak 7 hours ago 1 reply      
Not many people have caught onto the fact, but Android has already exploded massively. It's a moving target, but during the same time that RIM, Apple and MS shipped their flagship products (iPhone4, BB Torch, WP7), Android outsold them all combined. With Nokia mired in Symbian-MeeGo nowhereland, there doesn't seem to be anything on the horizon to slow down that momentum.

Despite all the hand-wringing over "fragmentation", the fact is that Android scales and adapts wonderfully and this will be the basis for its dominance. Consider the new wave of cheap Android smartphones coming - with what other OS could a middle-aged Indian man upgrading from an old crummy Nokia be just as satisfied as me - a geeky dev who loves the latest and greatest and has gone through a dozen smartphones?

2 points by mgkimsal 5 hours ago 0 replies      
slight prediction here: apple may introduce a 3g plan for a next generation of iPod Touch, strongarm AT&T or Verizon into offering a decent unlimited data plan, and start pushing facetime (and possibly skype) over traditional 'phone' functionality altogether. Yeah, why would a carrier cannibalize their own voice network? Perhaps they won't, but they might for the right amount of cash (isn't it always about cash?)
2 points by swombat 7 hours ago 3 replies      
Even on a geeky site like swombat.com, mobile traffic is still only 12% of the total. Android has a long way to go. I doubt it will achieve all that in 2011.

Just because everyone can access the web on their mobile doesn't mean it will become their main channel to it. I have mobile devices too, but 99% of my web browsing is still on my laptop, and until/unless I need to be super-mobile all the time, that will remain the case.

2 points by dave1619 6 hours ago 1 reply      
I agree with FW that 2011 will see a huge explosion in smartphones. 2010 was already a huge boom. But the interesting potential lies in the billions of people in the developing world: can they be enticed to join the smartphone world with cheap smartphones priced under $100? I think they can. Virgin Mobile has an Android device for under $200, no contract and only $25/month for unlimited data and 300 minutes. That's enticing, and it's probably just going to get cheaper. Android has the price point where it can lure people into a smartphone from a regular phone, and it has the platform where it can keep most of the people. The big question to me is what is Apple's response going to be? Is Apple just going to sit there and let Android take the low-end market, or is Apple going to proactively respond and release a lower-end phone, i.e., an iPhone Nano, that can compete with the low-end Android phones? The same goes for tablets: will Apple release a 7" tablet to compete with the low-end Android tablets?
4 points by code_duck 7 hours ago 3 replies      
Fred Wilson does very well stating and investing in the obvious.
2 points by fonosip 4 hours ago 0 replies      
good points. yet apple has options. the iPad was priced very aggressively out of the gate, to the point of having no real competitor even now.

the iPhone pricing is muddled by the carrier subscription costs. which incidentally are the best/most profitable large business in the world right now.

the iPod touch has no competition even after years in the market.

my money for 2011 is that apple continues to be the money/profit leader. not android

0 points by joshu 1 hour ago 0 replies      
I can't wait for Dell to get into the smartphone business.
File synchronisation retout.co.uk
9 points by kgarten 3 hours ago   11 comments top 2
1 point by mrb 19 minutes ago 0 replies      
The solution is quite simple: use Dropbox to synchronize a single binary file that is a LUKS-encrypted Linux filesystem (or TrueCrypt or whatever).

This meets all the requirements this person is asking for. All his systems could even be mounting the image using different passphrases (IIRC LUKS allows up to 8 different passphrases for the same image). It will work well because Dropbox will of course synchronize the deltas and not transmit the whole image every time a single byte changes.

Only downside is you can only have 1 client mounting the image at a time... Perhaps some clustered filesystem could get rid of this limitation (that would be an interesting use of them).
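The delta claim is the load-bearing part of this suggestion, so here's a toy sketch of block-level change detection (the 4 MB block size matches Dropbox's documented content chunking; everything else is a simplification of mine, not Dropbox's actual sync protocol). Flipping one byte deep inside a multi-megabyte encrypted image dirties only one block:

```python
import hashlib

BLOCK = 4 * 1024 * 1024  # Dropbox chunks file content into 4 MB blocks

def block_hashes(data: bytes):
    """Hash each fixed-size block of a file independently."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def changed_blocks(old: bytes, new: bytes):
    """Indices of blocks that would need re-uploading."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]

# Flip one byte deep inside a 12 MB "disk image": only block 1 is dirty.
image = bytes(3 * BLOCK)
edited = bytearray(image)
edited[BLOCK + 17] ^= 0xFF
print(changed_blocks(image, bytes(edited)))  # -> [1]
```

A real client also handles content that shifts position (rsync-style rolling checksums), but for an in-place filesystem image, fixed-offset blocks are exactly the common case.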

4 points by gnosis 2 hours ago 2 replies      
"If anyone mentions Dropbox to me one more time, I will scream. I'm sure it's a wonderful solution, but I have deep misgivings about handing my data over to someone I don't trust."

It's really disturbing how eagerly most people are willing to hand over personal information to corporations; whether it's data to the likes of Dropbox, movie viewing preferences to Netflix, book reading preferences to Amazon, all of one's email conversations to Google, and a list of who one's friends are to Facebook.

I could go on and on listing information that the vast majority of people give away about themselves without a second thought to get some "free" service.

And the saddest thing is that few people care, or even think about what they're doing.

Hopefully, as the general public becomes more computer literate, and groups like the EFF and ACLU educate them on the privacy implications of these technologies, their attitudes might change for the better.

New Statistical Law Discovered scientificcomputing.com
26 points by J3L2404 6 hours ago   9 comments top 4
18 points by jforman 5 hours ago 3 replies      
Just in time to vie for "most overhyped scientific headline of 2010."

The paper, "Bimodal gene expression in noncooperative regulatory systems" (available at http://www.pnas.org/content/107/51/22096.full), addresses the question of how bacterial populations achieve bimodal phenotypic distributions (basically, 1 blue cell goes in, 5 blue cells and 5 pink cells come out in a stable ratio, wtf).

This phenomenon has previously been explained by a) cooperativity or b) feedback loops. Cooperativity is a phenomenon in molecular biology where the binding of one protein to, say, DNA will increase the rate of binding of a partner in crime (which can, say, turn a gene on) -- this is a relatively simple regulatory scheme that allows proteins that respond in linear fashion to their environment to produce a non-linear output. Feedback loops are pretty self-explanatory (one thing worth mentioning is that if the feedback loop involves the transcription of a gene, then it is much slower-acting than cooperativity).

Cooperativity can explain the bimodal distribution problem because cooperative systems can act as "regulatory switches" where very small fluctuations in environment can result in very different behavior (see http://bioweb.wku.edu/courses/biol566/Images/MCB3-29hemoglob... for a simplistic example of hemoglobin response to oxygen). Thus, one blue cell goes in, 10 cells come out with very slight environmental differences that are on this response-boundary - voila, 5 blue cells, 5 pink cells.

Anyway, this paper shows that a cell population can have a bimodal phenotypic distribution without cooperativity, simply by having single proteins that respond in non-linear fashion to their environment. That's...a new statistical law?

I don't have the time to fully deconstruct this paper, but after a skim I'm left with the question, "so how does a single protein achieve non-linear response?" This paper seems to provide a mathematical model without giving a good explanation for how it would be achieved in nature. The only answer I can come up with is cooperativity within the protein itself - say, cooperative allosteric regulation. Which would make the theoretical contribution of this paper a lot less exciting.
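For readers unfamiliar with how cooperativity buys switch-like behavior, the standard Hill equation (textbook material, not anything from the paper under discussion) makes the point in a few lines: raising the Hill coefficient n turns a graded response into a near-threshold switch.

```python
def hill(x, K=1.0, n=1):
    """Hill equation: fractional response at input concentration x."""
    return x**n / (K**n + x**n)

# The same small fluctuation around the threshold K = 1.0 ...
low, high = 0.8, 1.25

# ... barely moves a non-cooperative (n = 1) response ...
graded_swing = hill(high, n=1) - hill(low, n=1)   # ~0.11

# ... but flips a cooperative (n = 4) response almost like a switch.
switch_swing = hill(high, n=4) - hill(low, n=4)   # ~0.42
```

Hemoglobin's sigmoidal oxygen curve (the simplistic example linked above) is the classic biological instance of n > 1.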

1 point by gojomo 24 minutes ago 0 replies      
Bad headline, sure, but interesting result. It suggests that even without signalling, and with identical nuclear DNA, cell division can result in daughter cells that each try a different side of a gene-regulation possibility space.

If the environment has changed subtly to advantage or disadvantage that gene expression -- well, voila, half the offspring are a probe in the better direction. That's why the process reminds me of binary search.

And across the billions of sibling cells each trying different levels of each gene? Massively parallel search for the optimal levels of gene expression, without any mutation or genetic exchange. That's an exciting idea, to this layman.

5 points by HilbertSpace 4 hours ago 0 replies      
Commonly in natural processes, a 'concentration' is a 'random walk' with two 'absorbing' states: (1) concentration 0% and (2) concentration 100%.

A state is 'absorbing' if, once the process reaches it, it never leaves.

So, the work is talking about concentrations of proteins or their mRNA varying due to thermal 'randomness'.

Maybe in this case, a decent first-cut mathematical model of what is going on is just a discrete-time, discrete-state-space Markov process. Here each discrete 'step' in time is one generation of a cell. The 'states' record the concentrations. So we could use, say, some of the 'nonlinearity' effects to estimate the probability of moving in one step from one state (of concentrations) to another.

But the big, 'global' issue is that the concentrations just keep changing and stop changing only at the absorbing states of 0% or 100%.

For something with more detail than this 'global' view: if we have those probabilities, then we can use some standard work on such Markov processes to say how fast the concentrations reach the 'absorbing' states of 0% or 100%.

The literature on Markov processes is enormous. For a start, consider:

Erhan Çinlar, 'Introduction to Stochastic Processes', ISBN 0-13-498089-1, Prentice-Hall, Englewood Cliffs, NJ, 1975.
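The 'global' picture above is easy to simulate. A minimal sketch (the symmetric transition probability is an illustrative assumption -- in the paper's setting it would come from the nonlinear response):

```python
import random

def absorb(start=50, step=5, p_up=0.5, rng=random):
    """One lineage: a 'concentration' walks until absorbed at 0% or 100%."""
    x = start
    while 0 < x < 100:
        x += step if rng.random() < p_up else -step
    return x

# Many independent lineages from the same midpoint split between the
# two absorbing states -- a bimodal population with no signalling.
rng = random.Random(42)
ends = [absorb(rng=rng) for _ in range(2000)]
frac_high = ends.count(100) / len(ends)   # ~0.5 for a symmetric walk
```

Every lineage ends pinned at 0% or 100%, and a symmetric walk from the midpoint splits the population roughly in half -- a bimodal distribution of endpoints even though every lineage follows identical rules.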

1 point by gojomo 4 hours ago 0 replies      
It's molecular biological binary search!
Telling Right From Wrong nytimes.com
16 points by J3L2404 5 hours ago   8 comments top 4
1 point by jessriedel 4 minutes ago 0 replies      
This is still a distinctly minority view among moral philosophers. I believe (I can't find the survey at the moment) that most think the is-ought gap is fundamental.
2 points by joe_the_user 3 hours ago 2 replies      
Building a plausible a-religious moral philosophy has always reminded me of building a plausible perpetual motion machine.

How can you conceal the fact that you're trying to extract a conclusion from a system that doesn't have a premise? Well, there are indeed lots of methods of misdirection.

The last try at building a perpetual motion machine that I remember involved claiming to have a device for "amplifying power" but not being able to hook the device to itself "somehow". Made the system harder to disprove.

2 points by jackfoxy 4 hours ago 1 reply      
I have heard (but never confirmed for myself) that the term translated into English as "sin" from the Torah has more the sense of "missing the mark" in the original Hebrew.

It appears from this summary that this atheist has replaced traditional Western religious-based morality with the very same moral system, sans God.

0 points by Avshalom 1 hour ago 0 replies      
right: ->

wrong: 1=2

Intellectual Property Run Amok motherjones.com
8 points by mbesto 3 hours ago   3 comments top 3
1 point by quanticle 0 minutes ago 0 replies      
I liked the example of the Internationale. I mean, how hypocritical is it to assert property rights over the Communist anthem?
3 points by tzs 1 hour ago 0 replies      
More like "writer with a bunch of random out of context facts who has a deadline and can't think of anything to actually say so just throws the facts all out there in a list and hopes no one notices he didn't say anything" run amok.
1 point by pyre 2 hours ago 0 replies      
The George Foreman example is a bad one. How is getting paid to endorse a product an example of "Intellectual Property run amok?"
Should You Reset Your CSS? sixrevisions.com
22 points by kingsidharth 6 hours ago   10 comments top 7
9 points by cheald 5 hours ago 1 reply      
I've landed solidly in "always reset" territory. There are just so many inconsistencies between browsers/platforms that I'd rather have everything looking as consistent as possible, and work from there. It does add extra garbage to Firebug, but it's worth not having to debug a display issue on a browser on a platform that I don't have access to.
5 points by mikeryan 5 hours ago 0 replies      
My personal technique is to start with a slightly simplified version of Meyer's reset. Then as I get to final QA I comment out each reset selector and see what happens.

It kind of gives me a happy place between using them and not.

4 points by akamaka 3 hours ago 0 replies      
I don't use pre-built resets, partly because they didn't exist when I started doing web development, but mostly because I'm not pursuing "pixel perfection". Ironing out tiny cross-browser issues is the very last step for me, after having dealt with content, usability, accessibility, maintainability, and performance.
8 points by greghines 5 hours ago 0 replies      
I avoid CSS resets because it makes it more difficult to work with the styles in Firebug. Because most resets apply to every element, if you've selected an element five levels deep, you're going to see five copies of the reset in the style list. And if that element is inheriting from other non-reset styles up the tree, they're going to be buried in the middle of those extraneous copies of the reset. That's a lot of noise added to what should be a simple, straightforward list of the styles that apply to the element you've selected.
2 points by Pewpewarrows 4 hours ago 1 reply      
I use Richard Clark's CSS Reset paired with my custom-built scaffolding CSS stylesheet. It rips out all the base styles applied by browsers, then builds them back up so that absolutely everything -- fonts, form elements, tables, code blocks, etc. -- not only looks identical in every browser, but is easy on the eyes.

I find it pretty laughable that developers ignore Resets because of a little extra noise in their CSS debuggers. That's like advocating against using a framework for another language because your stack traces are a little longer by default. In Chrome's Dev Tools you can hover over any final computed style to determine exactly what file and line it came from, and the rules are shown most-specific first. I rarely if ever have to scroll down to the broader rules where the duplicated Reset noise tends to be more prominent. I haven't used Firebug in a year or two, but I'm assuming it has similar developer-friendly functionality.

2 points by juddlyon 1 hour ago 0 replies      
There are good arguments for both but I've tended to agree with Yahoo's approach to YUI, which includes a reset. Not because I've explored it in detail, but because I figure Y! has (and at a large scale).
1 point by schammy 8 minutes ago 0 replies      
The second I learned this trick, many years ago, I started using it in everything. The argument that it's "slow" is ridiculous. It's like using while(i--) instead of a for() to shave 0.0000001 milliseconds off your javascript loops. Just admit you have OCD and move on ;)
Unexpected Connections In Mathematics rjlipton.wordpress.com
28 points by bdr 8 hours ago   discuss
Using ZSH jbw.cc
34 points by preek 9 hours ago   7 comments top 3
1 point by ihodes 4 hours ago 1 reply      
This is pretty convincing for me -- I've been holding off on switching from bash for years now.

The single thing keeping me back is that bash is the shell to use: some scripts you need to set up xxx thing are written for bash, and trying to debug some small difference in behavior between zsh and bash is something I don't want to find myself doing.

Has anyone else not switched because of this, or switched after grappling with similar issues? What are your thoughts?

1 point by jbw 2 hours ago 0 replies      
Fixed a typo in the HTML for your viewing pleasure...
1 point by pythonrocks 7 hours ago 0 replies      
Bash's dead, baby. Viva zsh!
       cached 27 December 2010 01:59:01 GMT