Hacker News with inline top comments · 24 Aug 2011 · Best
Twitter Bootstrap github.com
988 points by d0vs  4 days ago   106 comments top 36
tptacek 4 days ago 4 replies      
Not to take anything away from a really nice contribution by Twitter's team, but know that stuff like this is available, in tens of different variations, for tens of dollars at sites like ThemeForest.

A typical "admin" theme has some variant of the 960 grid, nice forms and buttons, drop-down navs, tabs, and accordions, and multiple layouts.

The templates won't be as well documented as this one, and they'll be more brittle and probably more poorly coded... but they'll only take an hour or two of additional work to (say) Hamlize and bring into your Rails project. They'll look more distinctive than Bootstrap. They'll have better browser compatibility.

Unless you're crazy enough to be selling web apps to web developers, no customer of yours is ever going to know or care which of these things you started out with.

Or, use Bootstrap; it's really nice. I'm just saying you have lots of good options.

akavlie 4 days ago 3 replies      
Any clue what the browser compatibility is like, and how well it holds together in older browsers?

All they say is "only modern browsers in mind" -- but I see no details about that.

If it holds together sans pretty effects that's fine.

Pewpewarrows 4 days ago 1 reply      
All it needs is for that grid to be responsive using CSS media queries, and that will probably be the last CSS toolkit I'll ever need. I have my own internal boilerplate one that I use, but it's nowhere near as refined.

Bravo to the Twitter dev(s) responsible for this. You probably just saved me hundreds of hours of future work and frustration.

joeshaw 4 days ago 1 reply      
What is the relationship between this and Bootstrap.less (http://www.markdotto.com/bootstrap/), other than it appears to be by the same person? What about Skeleton (http://www.getskeleton.com/)? Are they just 3 different visual approaches to the same problem, or is there an evolutionary progression among them?
wildmXranat 4 days ago 1 reply      
For a developer like me, who doesn't have a lot of time to sharpen UI skills, this is amazing.
akavlie 4 days ago 2 replies      
Looks pretty nice. But how did this come out of Twitter? I haven't seen half of these elements on twitter.com.
bdesimone 4 days ago 2 replies      
Gorgeous as it is -- I just can't use something that utterly breaks in IE7 and IE8. I wish the realities of browser support were different.

Nice contribution in any case.

jhuckestein 4 days ago 4 replies      
This seems like a perfect addition to a hackathon toolkit :) I'll definitely use it.

Random rant: Why do they call it "Twitter bootstrap"? IMO that kind of association is a little vain. Rails isn't called 'Ruby on 37Signals'. Besides, it's slightly confusing because it seems to have little to do with Twitter.

liedra 2 days ago 0 replies      
It's a bit of a shame that the "just hotlink the CSS" approach doesn't actually work very well - it means that some parts won't function (such as the dropdown menu). My partner and I have just spent the better part of half an hour trying to work out why this didn't work (the documentation didn't mention any JavaScript requirement).

Docs could use some serious clarification - which parts are CSS-only and which require what js files.

dfischer 4 days ago 0 replies      
This is absolutely interesting and amazing to see. There are a lot of good elements and styles in here. I think this is a great effort and a step forward in bringing a "standard UI" library to the web. I think there's a good case for websites starting to have a more "locked in" UI/look & feel. I'm not convinced by this notion, but it's something I've been thinking about lately.

Either way, this is a great library for bootstrapping your web-app.

I do have some qualms though, due to the limitations of CSS. You really should practice separation of content from presentation. It's great to see them leverage LESS, but I'd like to see this integrated with SASS/Compass.

I'll probably convert it over and release the link on hacker news when ready.

* the stuff that's really cool is forms/modal/navbar/tips

seunghomattyang 4 days ago 5 replies      
I only recently started learning HTML and CSS so I was wondering if there is a good resource for learning how to use grids.

I don't exactly understand how to use grids (or even why). Is it for visual consistency or is there an underlying usability/maintainability benefit to using grids?
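To give a concrete feel for the grid question above: a CSS grid is mostly just fixed column-width arithmetic, which gives you visual consistency (everything lines up) and maintainability (widths come from one formula instead of ad hoc pixel values). A quick sketch, using numbers from the common 960.gs system (12 columns of 60px with 20px gutters -- assumed here purely for illustration):

```python
# Width of an element spanning n columns in a 960.gs-style grid:
# each column is 60px, and each of the (n - 1) gutters between them is 20px.
COLUMN_PX = 60
GUTTER_PX = 20

def span_width(n_columns):
    """Pixel width of an element spanning n adjacent grid columns."""
    return n_columns * COLUMN_PX + (n_columns - 1) * GUTTER_PX

# A full 12-column row is 940px, leaving 10px of margin on each side of 960px.
for n in (1, 3, 6, 12):
    print(n, span_width(n))
```

Because every element's width falls out of the same formula, any mix of spans that adds up to 12 columns fills a row exactly -- that's the usability/maintainability benefit.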

katieben 3 days ago 1 reply      
Just used it today with my new 1x52 project: http://jambx.com/ - Twitter for song lyrics.

I'm doing some sort of new web project once a week, for hopefully a year. Writeup on my blog about it: http://kguac.com/2011/08/1-x-52-week-4-jambx-twitter-for-lyr...

Adkron 4 days ago 1 reply      
Classes and ids should describe the data that you are marking up, and not the structure of the view. It is really hard to continue to work with something like this. It increases the number of classes on each element and causes the developer and designer to fight over which classes go where. Now we have separate classes for style and functionality.

If we describe the data then it is easy to describe the view of that data in different ways, and with the same html.

leek 4 days ago 1 reply      
This is great, but I'm afraid I will be forking and adding SASS/SCSS support.
SudarshanP 3 days ago 0 replies      
It would be awesome if there was a forked Django/Rails/Sproutcore framework with tight integration with something like Twitter Bootstrap.

That way you would have a super awesome looking app right from the start.

The pros can work on the lean frameworks while the lazy ones like me can hop onto the shoulders of the giants ;-)!

pbreit 4 days ago 1 reply      
I've been looking at Blueprint and Skeleton but this looks like a better option.

If I'm not a LESS user am I going to feel second class?

artursapek 4 days ago 2 replies      
Wow nice, great to see them writing off outdated browsers.
ricardobeat 4 days ago 1 reply      
Looks very well made, I'm using it next week already for a small project :)

The IE incompatibility is a shame, this could be a nice poster for progressive enhancement.

southpolesteve 4 days ago 2 replies      
Would love this if it used SASS
jemeshsu 4 days ago 0 replies      
Is the site http://twitter.github.com/bootstrap/ using Bootstrap? When viewed on an iPhone, there is a thin white margin at the right border. This white border stays even if you double-tap the screen, which would normally make the site expand to fill it.
Tichy 3 days ago 0 replies      
What I don't get is why browsers don't come with good looking typography settings by default.
kbj 3 days ago 0 replies      
I tried integrating this with the html5-boilerplate project, as that might improve the cross-browser compatibility. It seems doable. Most tweaks seem to be related to the CSS reset part, as the two differ a lot. (The newest h5bp uses normalize.css vs. the Eric Meyer CSS reset.)
voidfiles 4 days ago 1 reply      
Why LESS, why not Sass?
ubi 3 days ago 0 replies      
As a solid developer who lacks real design skill I thank you guys for building this. Having a clean looking app is key when showing off new ideas.
mark242 4 days ago 0 replies      
This is fantastic. Google created a stylesheet like this once, a while ago, that was extremely lacking when it came to real-world actions and layout. This is incredible.
dmmalam 3 days ago 0 replies      
Would love a stylus version to use with nodejs
brackin 4 days ago 0 replies      
I love this, good on Twitter for realising this. Making the web more beautiful.
alexis-d 4 days ago 0 replies      
I'm always happy when I see a big company releasing really nice stuff as free software, and that's the case. Congrats Twitter!
schiptsov 3 days ago 0 replies      
Why and how is it better than YUI? Because it is from Twitter? ^_^
tcderek 3 days ago 0 replies      
I really like Bootstrap as a tool to create a MVP. Will definitely be using it soon!
krsgoss 4 days ago 0 replies      
Nice work and thanks for contributing. Look forward to trying this out on a future project I'm working on.
mvts 3 days ago 0 replies      
Thank you so much for that framework. I just started implementing it in my latest app. Looks great! I'm struggling a bit with the modals implementation, but I'll get it to work.
jdelsman 4 days ago 0 replies      
Thanks, Twitter!
methane 4 days ago 0 replies      
It's such a blessing for programmers.
harrisreynolds 4 days ago 0 replies      
This looks very cool... we need richer frameworks to standardize web dev!
ChrisArchitect 4 days ago 0 replies      
wow, this is interesting. What a contribution.
Solar-panel "trees" really are inferior googleusercontent.com
640 points by DevX101  3 days ago   98 comments top 25
raganwald 3 days ago 8 replies      
I didn't find it that harsh. He was direct and took pains not to ridicule a thirteen-year-old for making an entirely age-appropriate mistake in measuring the results. Instead, he asked the perfectly valid question of how this becomes news without critical thought.

In that, the critique seemed hopelessly ignorant of how the news works. Why should science fair projects be treated any differently than crime, the personal lives of celebrities, politics, or economics? News outlets publish first and ask questions later or not at all. They have gone to court to defend their right to publish things they know to be false.

How did a confused science project become international news? Why, the same way that almost any overnight sensation becomes international news, by being digestible, by being something people want to be true, by appealing to their preconceived biases.

A commenter pointed out that this is the value of a peer-review process. And indeed, this result was published without peer review. So who is the fool here? The journalist for publishing without review? Or the reader who knowingly accepts the result despite it being published without peer review and/or corroboration?

raganwald 3 days ago 1 reply      
Coming back to this almost eight hours later, I ask: What is the problem here? On HN, we upvote articles that are interesting. My idea of a downvote is an article where I felt I lost IQ for reading it. In the case of the kid's mistaken result, is there any question his research and theory were interesting? Does anyone honestly feel stupider for having read about it and considered the possibility that he was correct?

Quite honestly, almost everything that makes it to the front page of HN is wrong. We talk about software development, startups, muse about whether Apple is brilliant or is lucky enough to have lame competition, argue about Haskell and Erlang... All stuff that is non-empirical and therefore unfalsifiable.

How does that stuff get a free pass to be on the front page of HN without peer-reviewed research backing it up? I'll tell you how: We're smart enough to know that all of that stuff is probably at least partly wrong, but if there's something in there that makes us smarter, it's worth reading and upvoting and discussing.

If the kid's ideas had in them just one thing that made us smarter for having thought about it... That's a win, that's worth upvoting and repeating. Why wait for peer review? As long as nobody ran out and dropped a million bucks on manufacturing solar arrays, what's the harm? If anything is wrong with an article, a day or so later, all of the flaws will be corrected. And that's exactly what has happened here.

Thinking about this, I don't see a problem with the “blogosphere” or with HN upvoting and tweeting and repeating the original article. Thank goodness we don't put everything through a peer-review first. I'd say things are working just fine, and I encourage every other thirteen year-old kid to experiment and publish.

No harm, no foul. There's nothing in there that's more wrong than anything I've ever said in a blog post or a comment, it's just easier to prove where empirical science is concerned. But even when they're wrong, my posts are useful if they help people think, and I suspect his post is useful for the same reason.

waterlesscloud 3 days ago 4 replies      
This doesn't address what I thought was the insight, which is about finding optimal placement and facing for stationary solar panels when the light source (the sun) is not stationary, neither throughout the day nor throughout the year.

It doesn't seem impossible that some placements are better than all-in-one-direction, especially over time, and that's what I thought the experiment was about.

Am I misunderstanding something basic?


Going back and reading the kid's writeup at http://www.amnh.org/nationalcenter/youngnaturalistawards/201... shows some interesting details.

He's not using voltage as a proxy for power; he's using it as a proxy for "sunlight collected". There are two voltage graphs on that page that are machine-drawn, not hand-drawn. The point of interest in the two graphs is that the Standard graph has narrower peaks of voltage, whereas the Tree graph is broader. This represents the idea that the Tree was generating electricity over a longer period of time - NOT that it was generating more power, but that it was collecting sunlight for a longer period.

The 20% and 50% pie charts indicate this same idea. The percentages are hours, not watts or volts. 12.5 hours vs 8 hours in one timeframe, 13.5 vs 11 in another. Hours. Not volts, not watts. Time, not power.
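The "hours of collection, not watts" point above can be sketched numerically. Here's a toy model (the sun path, panel tilts, and azimuths are all invented for illustration, not taken from the kid's setup): panels fanned out at several azimuths are above a usable-light threshold for more hours per day than a single fixed panel, even though the fixed panel wins at midday.

```python
import math

def sun_direction(hour):
    """Crude sun model: rises at 6h, sets at 18h, sweeping east to west.
    Returns (azimuth_deg, elevation_deg)."""
    frac = (hour - 6.0) / 12.0            # 0 at sunrise, 1 at sunset
    azimuth = 90.0 + 180.0 * frac         # 90 = east, 270 = west
    elevation = 60.0 * math.sin(math.pi * frac)
    return azimuth, elevation

def incidence(panel_az, panel_tilt, sun_az, sun_el):
    """Cosine of the angle between the panel normal and the sun ray, clamped at 0."""
    pa, pt = math.radians(panel_az), math.radians(panel_tilt)
    sa, se = math.radians(sun_az), math.radians(sun_el)
    n = (math.sin(pt) * math.cos(pa), math.sin(pt) * math.sin(pa), math.cos(pt))
    s = (math.cos(se) * math.cos(sa), math.cos(se) * math.sin(sa), math.sin(se))
    return max(0.0, sum(a * b for a, b in zip(n, s)))

def hours_active(panels, threshold=0.1):
    """Hours (in 0.25h steps) during which at least one panel exceeds the threshold."""
    active = 0.0
    for step in range(24 * 4):
        hour = step / 4.0
        sun_az, sun_el = sun_direction(hour)
        if sun_el <= 0:
            continue
        if any(incidence(az, tilt, sun_az, sun_el) > threshold
               for az, tilt in panels):
            active += 0.25
    return active

flat = [(180.0, 30.0)]                                   # one south-facing panel
tree = [(az, 45.0) for az in (90, 135, 180, 225, 270)]   # fanned azimuths
print(hours_active(flat), hours_active(tree))
```

The fanned configuration catches the low morning and evening sun that the single panel misses, so it clocks more active hours - which is exactly the "time, not power" distinction.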

llambda 3 days ago 1 reply      
It's refreshing to see a counterpoint to the MSM's enthusiasm for over-dramatization and poor fact-checking; this wasn't a breakthrough in the science of photovoltaics, as some headlines seemed to read. And I'm particularly happy that the author took great care to not target the boy. He should be encouraged to continue this kind of scientific pursuit and not be dissuaded by mistakes in the research. In fact this is a good example of how peer-reviewed research actually functions. That said, I feel the MSM only does him a disservice by misrepresenting the implications of his project as something more than it might be.
skrebbel 3 days ago 0 replies      
I don't think you could put this out there any less harshly than this. Plus, all the harshness that's there is aimed at stupid journalists, not at the kid. And, well, they deserve it.
DevX101 3 days ago 3 replies      
This article was posted last night, but taken down by this morning. I recovered it from google cache.
DanielBMarkham 3 days ago 0 replies      
I voted this up -- it read very well -- but I am a little uncomfortable about voting up an article the author clearly wanted to delete.
robryan 3 days ago 1 reply      
I wonder: if this idea had come from someone older, without any real experience or qualifications in solar, would it have been fact-checked a little better before it ended up everywhere?

I would want to encourage young people as much as possible; it's just that some of the things you hear about are only considered impressive when those doing the story factor age into it. If the person were, say, 30, it wouldn't even be a story.

callenish 3 days ago 0 replies      
When the blogger used the phrase "Fibonacci mysticism", it became clear he has far too much bias to pay attention to what the kid was actually doing.

He points out that voltage in solar cells is essentially boolean. OK. That means the kid has shown a way to orient the cells so that they get sunlight for a longer period of time. If evolution is anything to go by, his approach likely reveals a local optimum for maximizing the time that sunlight is collected.

Is that worthless? No, it is not. It may well explain why trees orient their leaves the way they do, and there could well be practical applications. If your solar array produces more than the peak power you need but the cost and energy loss of storing the power are significant, you may well want to use a pattern like this that mimics what trees do.

What impresses me is that the kid noticed something in nature he hadn't noticed before, read up on it to see what was known about this pattern, and then went out and measured to see whether what he had read was accurate. Once he had verified what he read, he then figured out how that knowledge might be applied. In doing so, he discovered something that, while obvious in hindsight, is not something that would necessarily have occurred to someone trying to figure out how to maximize the time for solar panels to deliver energy. I think the kudos are appropriate, even if the stories are misleading.

An important point here is that in any news story about something that you know a lot about, there are always errors. Always. We should all keep that in mind when we read or watch or listen to the news.

snorkel 3 days ago 0 replies      
Nonetheless I give the kid credit for exploring Fibonacci and biomimicry. Cool concept.
jshort 3 days ago 1 reply      
Regardless of Aidan's findings about solar energy, his ability to see a pattern in tree branches and then to attempt to answer the question why is impressive in my book.
ramy_d 3 days ago 0 replies      
is this like that time when some kid in india found a way to make solar cells out of hair?
nknight 3 days ago 0 replies      
Absent an explanation of why the author chose to delete this article, I can't bring myself to find it credible. It might be right, or it might be very wrong in a non-obvious way that the author finally realized, causing him to delete it. There are other reasons it might have been deleted, but I have no good way to judge.

Given that, I don't think it really adds anything to the overall debate.

njharman 2 days ago 0 replies      
> to orienting panels at sub-optimal angles

My understanding is that the tree is not at "sub-optimal" angles, but at varying angles throughout the day and the seasons as the sun moves. Sometimes those angles are worse than the flat panel's and sometimes better, and over time the tree was more often near optimal than the flat panel.

No one is gonna convince me that millions of years of evolution haven't resulted in efficient solar collectors. But from the original announcement, the biggest issues I noticed were that actual tree leaves move during the day and that many trees just give up in winter, neither of which was replicated in the solar tree. Therefore the solar tree is very likely not as good as nature's trees.

its2010already 3 days ago 0 replies      
All I have to say is the guy writes pretty darn well for a 13-year-old. Perhaps his experiment is flawed, but I was very impressed by his writing skills, and I know they will serve him well in the future.
justsomebloke 8 hours ago 0 replies      
If the young budding scientist is wrong, then why hasn't nature settled on our ideal 30-degree angle? He's certainly got me thinking!
EGreg 3 days ago 3 replies      
Wait a second bro.

Trees don't rotate to face the sun.
In order to have an optimal angle throughout the whole day, a panel would have to keep facing the sun, i.e. rotate.

I believe he may have missed this point.

As for the voltage measurement being the wrong metric -- agree.

acex 1 day ago 0 replies      
walter: Am I right?
dude: Yes, Walter, you're right. You're also an asshole.
salem 3 days ago 1 reply      
I think one of the comments on the linked article made the best point: his science teacher should have caught the mistake.
Education and journalism FAIL...
realou 3 days ago 0 replies      
I think the point here is that Aidan's system has no moving parts and does not track the sun. If you accumulate the total energy produced by the array over a period of one year in, say, capacitors, then I am not certain at all that his tree configuration would not be better than a fixed array of cells.

Of course, the fact that the picture in his paper showed a bright white wall right beside his experimental setup took credibility away from the whole thing.

But that prevents no one from repeating the experiment with better control over the variables, and who knows...

randomanonymous 3 days ago 0 replies      
You listened to the feedback in the last post's comments (the one with only a couple of comments).

This is MUCH healthier for a 13 year old, title wise, and article wise.

kubrickslair 3 days ago 1 reply      
A bigger point here is how we all mistook the results. There is always a struggle between extreme peer review, which leads to insularity, and going by the net real impact ("don't publish, build"). In the latter approach, the winner often forgets whose shoulders he stands on. In the former, you cannot stand too far above your peers' shoulders, and there are few real winners ("everybody is great, everybody wins" /s).
defdac 3 days ago 0 replies      
This reminds me of a newspaper in Sweden asking people on the streets 1) Do you care if children have made the merchandise you buy? 2) Do you do anything to follow up on it?
Where one woman responded 1) Yes, children have no sense of quality and it shows on the product. 2) No.
earbitscom 3 days ago 0 replies      
This post reads to me as if the person took personal offense to Aidan getting "undeserved" recognition. He claims to be angry at the journalists and scientists who didn't do their research before jumping to the conclusion that he had created something amazing, but the tone is just unnecessary.

> Some poor 13-year-old kid is all over the news

"poor 13-year-old" is hardly the words I use to describe a kid who got a bunch of press because they conducted an interesting science experiment, no matter how incorrect the end results of that experiment were.

> blindly parrot the words of this very misinformed (not to blame him, he's an unguided 13 year old) kid.

If a 45 year old scientist posted the results of an experiment that challenged the furthest reach of their abilities, and some scientists with more information or a different perspective explained why the experiment was conducted incorrectly or why the results were inaccurate, nobody would call the original person "very misinformed" or "unguided".

There are just a lot better ways to tell a 13-year-old conducting solar panel experiments, using concepts beyond most adults' abilities, how he could improve his experiment. The last thing you need to call a kid doing this kind of work is "very misinformed".

Kilimanjaro 3 days ago 0 replies      
It is not by ridiculing those who fail that we advance science; it is by thinking differently, even if we fail, for we learn and improve and someday we will succeed.

What if we use solar cells on one side of the leaf and mirrors on the other so they can reflect sunlight when the leaf looks east and the sun is going down west?

What if we use a sphere shaped cell? like a mango tree? or a cone shaped cell? like a pine tree?

What if we put solar trees on every sidewalk, even if they produce less energy, but they are more ergonomic and easier to accommodate in our daily lives?

What if we sprinkle some water on that solar tree? What if we make it taller? wider?

Go on kid, continue your experiments, listen to those who give you sound advice, ignore the ones who only add noise to the harmonious melody of nature and science.

13-Year-Old Makes Solar Power Breakthrough by Harnessing the Fibonacci Sequence inhabitat.com
638 points by jedwhite  4 days ago   142 comments top 36
pigbucket 4 days ago  replies      
Inhabitat credits treehugger.com as its source. Treehugger's article is not breathless about biomimicry, not spread over two pages, and not interrupted by adsense and images.

Edit: The source of treehugger's article is Aidan's own article, which is better still, and addresses briefly some of the issues raised in comments here (e.g., about fixed vs. tracking pv arrays).

colanderman 4 days ago 4 replies      
I don't get what his results have to do with the Fibonacci sequence. Between his "control" and the design he was testing, he changed:

1. panel heights
2. panel angles
3. whether panels were stacked or not

I would guess that any of those three things matter way more than the position of the "leaves" following the Fibonacci sequence. He needed to compare his design to a similar tree-shape whose "leaves" were, say, uniformly or randomly spaced; not to what amounted to a patch of moss.

(Which brings to mind: solar panels which were shaped more like moss (i.e. rough) would probably perform even better. I'm pretty sure I remember MIT or some place building a prototype like that.)

Finally, he measured voltage but made claims about power, which is a huge no-no for solar PV. Solar PV panels have highly nonlinear voltage/current characteristics, which means that increased voltage does not correspond to increased power, especially in setups such as the tree where the solar panels are not uniformly illuminated.
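The voltage-vs-power objection is easy to demonstrate with the textbook ideal-diode model of a PV cell (constants below are illustrative, not from any real panel): open-circuit voltage depends only logarithmically on light intensity, so it barely moves even when available power drops by a factor of five.

```python
import math

# Idealized single-diode solar cell; I0 and VT are illustrative constants,
# not values from any real datasheet.
I0 = 1e-10   # diode saturation current [A]
VT = 0.025   # thermal voltage at room temperature [V]

def open_circuit_voltage(photocurrent):
    """V_oc from the ideal diode equation: I_ph = I0 * (exp(V/VT) - 1) at open circuit."""
    return VT * math.log(photocurrent / I0 + 1.0)

full_sun = 5.0           # photocurrent in full sun [A]
shade = 0.2 * full_sun   # 20% irradiance

v_full = open_circuit_voltage(full_sun)
v_shade = open_circuit_voltage(shade)
print(f"V_oc full sun: {v_full:.3f} V, 20% sun: {v_shade:.3f} V")

# Max power scales roughly with photocurrent (fill factor is near constant),
# so 20% irradiance means roughly 20% power, yet V_oc fell only a few percent.
print(f"voltage ratio: {v_shade / v_full:.2f}, power ratio: ~{shade / full_sun:.2f}")
```

That's why a voltmeter reading looks nearly "boolean" across wildly different light levels, and why comparing voltages says almost nothing about energy harvested.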

felipemnoa 4 days ago  replies      
The comparison is against a flat row of cells that do not track the sun. I suspect that it will not do better compared to an array of cells that do track the sun.

Basically the tree of cells are arranged in different angles so that as the sun moves some of them will always be receiving optimal sunlight when their normal is parallel with that of the incident light.

Very nice insight, especially for a kid his age. I certainly would not have thought of it.

robinhouston 4 days ago 0 replies      
Towards the end of his life, Alan Turing spent some time trying to explain Fibonacci phyllotaxis. http://user29459.vs.easily.co.uk/wp-content/uploads/2011/05/...

I wonder if Aidan Dwyer is pleased by the thought that his scientific career is beginning where Turing's left off. I would be, in his shoes.

zacharyvoase 4 days ago 0 replies      
Everyone always talks about the Fibonacci sequence w/r/t the golden ratio, but in nature it's usually a variant on a Lindenmayer system: http://en.wikipedia.org/wiki/L-system
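For anyone who hasn't met L-systems: they're just parallel string-rewriting rules. A minimal sketch using Lindenmayer's original algae example (a standard textbook model, included here only to illustrate the mechanism, where the string lengths happen to be Fibonacci numbers):

```python
def lsystem(axiom, rules, iterations):
    """Expand an L-system by applying all rewrite rules in parallel each step."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's algae model: A -> AB, B -> A.
algae = {"A": "AB", "B": "A"}
lengths = [len(lsystem("A", algae, n)) for n in range(8)]
print(lengths)  # [1, 2, 3, 5, 8, 13, 21, 34]
```

Branching plant models work the same way, just with symbols interpreted as "draw segment", "turn", "push/pop branch" by a turtle renderer.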
tiddchristopher 4 days ago 1 reply      
The description, "The study earned Aidan a provisional U.S patent," is misleading. A provisional patent is merely a completely automated recognition of your claim to an invention. You submit your provisional filing, and then have one year to file an actual patent, which is reviewed by patent examiners. You don't "earn" a provisional patent--you just pay a couple hundred dollars and submit a few forms.
marknutter 4 days ago 1 reply      
Is it just me, or do most articles about young kids doing intellectually notable stuff start out with something along the lines of "while most 13-year-olds spend their free time playing video games or cruising Facebook...."
scorchin 4 days ago 0 replies      
On Aidan's own article[1], you can see that he's referenced work that guided him in making this breakthrough. Hidden in the bibliography is a Dr. Seuss children's book!

Geisel, Theodor Seuss (Dr. Seuss). The Lorax. New York: Random House Publishers, 1971.

[1] http://www.amnh.org/nationalcenter/youngnaturalistawards/201...

waterlesscloud 4 days ago 3 replies      
I wonder if some genetic algorithm style testing of angles and placements could yield even more efficiency?

Seems possible that nature hasn't yet hit optimal design in this area.

martinkallstrom 4 days ago 1 reply      
Wait... so there is a reason trees look like that? This was awesome, all the more for being a discovery by a 13-year old kid.
Protagoras 4 days ago 1 reply      
A couple of points:

1. Like others said, this is in comparison to a flat non-tracking solar panel; the tree configuration would lose out significantly against a tracking panel.

2. Fairly disingenuous graph on the second page, but then again professionals in business and science do this all the time as well.

3. With the current state of solar technology this patent is useless. But if someone invents solar cells that are so cheap they cost less than the solar tracking equipment, this could become quite a lucrative patent.

4. I wasn't aware you could patent things which are this directly copied from nature. I was under the impression that you could, say, patent a mechanism which emulates the motion of a specific fish, but not the motion itself - or can you?

tripzilch 4 days ago 0 replies      
An explanation of why this actually doesn't have a lot to do with Fibonacci, from the other thread:


Additionally explains/shows that the Fibonacci sequence, or the golden ratio, both do NOT generally occur in nautilus shells, spiral galaxies, ancient design principles, body ratios nor are they perceived as significantly more aesthetically pleasing than other ratios of small numbers.

cperciva 4 days ago 3 replies      
He "discovered" the Fibonacci sequence in how trees branch? Seriously?

Maybe I'm just being a grumpy old guy here, but when I was in school this was in our math textbooks as an example of how the Fibonacci sequence appears in nature.

colanderman 4 days ago 1 reply      
Apparently in the "flat" design, half of the solar panels are on the back roof of the model house, facing his actual house, and thus likely not getting any significant light at all.
tintin 4 days ago 0 replies      
http://farm4.static.flickr.com/3206/2807030740_25f3f2fa53.jp... This 'tree' to charge your phone was designed in 2008.
I wonder if it charges better than a flat design.

Search for "Solar Powered Bonsai Tree".

pge 4 days ago 0 replies      
While this is interesting, the most important metric for solar power is not watts per square meter but watts per dollar of production cost. Complex configurations like this may be more efficient from a W/m2 perspective, but they are unlikely to be more efficient from a W/$ perspective.
ForrestN 4 days ago 0 replies      
It seems like trees are trying to maximize the density of leaves they can accommodate, balancing that against the decreasing usefulness of each leaf. I suspect that using the tree placement you could produce much more energy per square meter of land, because you could fit so many more panels.

Imagine if a tree's leaves were arranged in a grid. The footprint would be enormous. If you are making a solar farm in the desert with sun-tracking panels, I don't know how much this improves things, because there isn't much limit on land. But in a city, on rooftops, in backyards, etc., you might be able to get a lot more total energy out of a given plot this way.

auston 4 days ago 1 reply      
Question: Does his tree have nearly double the number of panels? If so, does that have anything to do with him getting more power?


scdc 4 days ago 0 replies      
They should combine this with the cell-tower that looks like a tree. Could reduce the cell tower's electricity draw.

Not sure these exist everywhere. Here is a Google Image search: http://www.google.com/search?q=cell+towers+look+like+tree...

joshaidan 4 days ago 1 reply      
He needs to do the same experiment during the wintertime when the angle of the sun changes. You won't get accurate results until the experiment is performed year round. Different angles of inclination perform better at various times of year, and there are some thoughts that multi-inclined arrays average the same output as uniform arrays.

Remember, a tree only has leaves in the summertime, not the winter. :)

TeMPOraL 4 days ago 0 replies      
A lot of comments mention tracking the sun. I'd like to remind everyone about SolarFlower.org - the open source solar collector with a clever, non-electronic sun tracking system. See http://www.solarflower.org/faq.htm.
bobds 4 days ago 0 replies      
I wonder if this concept could somehow be applied at a microscopic level.
pedalpete 4 days ago 1 reply      
I find it amazing that with all the interest in bio-mimicry, this hasn't been tried before. Did nobody ever ask why all plants share a very similar architecture?

They've even made solar cells that look like leaves before http://techon.nikkeibp.co.jp/english/NEWS_EN/20080527/152443..., but nobody bothered to test if a tree-like structure gathered more energy.

winsbe01 4 days ago 0 replies      
I think this is great. Not that he was 13 (though it is impressive), but that he thought enough to challenge the typical panel array that we've gotten solar power from in the past. It seems that he got some interesting results, too. Sure, maybe he didn't take some things into account, or maybe it's not super practical, but breaking out of the mold of large, 2D rectangular panels may be exactly what the solar energy world needs in order to innovate on ways to harness more energy from the sun efficiently.
jshort 4 days ago 0 replies      
Trees have been attempting to capture the sun's energy for a long time, and I'd like to see a comparison to other plants' efficiency at capturing energy. Nature is powerful.
RobertHubert 4 days ago 1 reply      
Can't get much better than millions of years of try and die, I guess. Mother nature is pretty crazy! We should copy her work more often :)
redthrowaway 4 days ago 2 replies      
So perhaps this is a question that others have answered, but what springs to mind for me is: why? I get that trees that follow the Fibonacci sequence are more productive, but without an answer to why that is, it remains a bit of a kludge. I would love to see some explanation of why this configuration is optimal.
ck2 4 days ago 1 reply      
If he wanted better mainstream press he should have thrown the word "fractals" in there. I bet more reporters know the word "fractals" than "Fibonacci sequence".
jagtesh 4 days ago 0 replies      
First genuinely interesting article I've read ever since M.G.Siegler hit Techcrunch
jsg 4 days ago 0 replies      
I wonder what Eden Full (http://www.odec.ca/projects/2006/full6e2/index.htm) thinks about Aiden's project...
doyoulikeworms 4 days ago 1 reply      
This is much harder to keep clean than a simple, flat array of panels.
benmlang 4 days ago 1 reply      
Need more 13 year olds like that.
tomp 4 days ago 1 reply      
Now that is something that really deserves a patent!
prtk 4 days ago 0 replies      
NASA designed an antenna using genetic algorithms. This guy can use a GA to optimize his solar-Fibonacci-tree further.

Way to go kid! The force is strong with you! Best of luck! :)

tete 3 days ago 0 replies      
It is a miracle that curiosity survives formal education. -- not Albert Einstein (according to Wikiquote)
sarabob 4 days ago 0 replies      
The panels on the tree are higher than those on the flat plane - you'd expect to get more light for longer from that alone.
Spin.js, a pure JS spinner github.com
534 points by michokest  5 days ago   66 comments top 20
jsdalton 5 days ago 3 replies      
I think the possibility of dynamically changing the speed of the spinner is interesting. If your spinner was representing a file upload, for example, you could conceivably adjust the speed based on the current upload rate.
seanalltogether 5 days ago 10 replies      
I hate to be that guy, but this takes up 45% CPU under Firefox for a simple spin animation.
Nycto 5 days ago 4 replies      
A spinner generated from http://www.ajaxload.info/ is 673 bytes. The minified javascript from this is ~3K. I suppose the trade off is features and flexibility, but I don't find myself needing much out of my ajax spinners.
JohnnyBrown 5 days ago 0 replies      
Not realizing what was meant by "spinner", I spent a few seconds waiting for the cool javascript demo to load before I realized.
rawsyntax 5 days ago 1 reply      
This is cool just for fun, but practically there are sites like http://ajaxload.info/ that generate spinners for you, if you don't know how to do it yourself
dalore 5 days ago 1 reply      
What's the battery drain like?
dw0rm 5 days ago 1 reply      
Are there any GIF loader generators that have all this options (speed, sizes, color) available? Could be good to use this as a preview, and then generate the final GIF.
pavel_lishin 5 days ago 0 replies      
Tested it on my iPhone 3gs - after some scrolling around and zooming (trying to reach the controls) the spinner "locked up" - it's no longer spinning, every bar is simply pulsing together in time, like a luminescent jellyfish.

I wonder if this could happen in a regular browser, too?

alanh 5 days ago 0 replies      
It was noted in this thread that this consumes a relatively large amount of resources, and it was suggested that flipping through PNG frames would be less CPU-intensive.

I have a hunch that's correct, but don't know.

I do however have an unanswered question on Stack Overflow seeking, ideally, a generator of JavaScript + PNG throbbers. First one to make ajaxload.info with PNG sprites and/or Canvas generation in supported browsers wins! (No need for most of the hideous ajaxload designs though.) http://stackoverflow.com/questions/6937149/best-practice-too...
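For the curious, the PNG-sprite approach alanh is asking about can be sketched in a few lines (names and frame sizes here are illustrative, not from any real generator): lay the frames out in a horizontal sprite sheet and step through them by shifting background-position on a timer.

```javascript
// Compute the background-position for a given frame of a horizontal
// sprite sheet; frameWidth is in pixels, frameCount wraps the animation.
function spriteFrameOffset(frame, frameWidth, frameCount) {
  return -((frame % frameCount) * frameWidth) + 'px 0';
}

// In a browser (hypothetical element id and sheet layout):
// var el = document.getElementById('throbber'), frame = 0;
// setInterval(function () {
//   el.style.backgroundPosition = spriteFrameOffset(frame++, 32, 12);
// }, 1000 / 12); // ~12 fps
```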

Sephr 5 days ago 0 replies      
The only reason to use this is for scalable throbbers, and since you're only going to configure it once, it'd be much easier and more compact to just use a generated SVG+SMIL instead.
ck2 5 days ago 0 replies      
My gif spinner is 1152 bytes, not sure what you are using that's taking much more.
justincormack 5 days ago 1 reply      
Ugh. Surely that's what CSS transforms and transitions are for. JavaScript animations should be a last resort...
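The CSS-only route justincormack is suggesting looks roughly like this — a minimal sketch, assuming nothing about Spin.js's internals, that generates a keyframe rule once and lets the browser animate the rotation instead of redrawing from a JS timer. The class and keyframe names are made up for illustration.

```javascript
// Build a stylesheet string for a rotating element; the browser's
// animation engine does the per-frame work, not JavaScript.
function cssSpinnerStyle(seconds) {
  return [
    '@keyframes hn-spin { to { transform: rotate(360deg); } }',
    '.hn-spinner { animation: hn-spin ' + seconds + 's linear infinite; }'
  ].join('\n');
}

// In a browser you would inject the rule once (hypothetical usage):
// var style = document.createElement('style');
// style.textContent = cssSpinnerStyle(1);
// document.head.appendChild(style);
```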
Jarred 5 days ago 0 replies      
Surprisingly, it worked on my Samsung Focus.

It was spinning very slowly, but the Mango update supports CSS3 well it seems.

uast23 4 days ago 0 replies      
Read through quickly at first and got misled by 'target', assuming that anything passed as a target will start spinning, and tried to spin an image :), which of course is not the case. Nice effort though.
minikomi 5 days ago 0 replies      
For those who want something minimal.. http://jsfiddle.net/4mrCU/2/embedded/result/
nicklovescode 3 days ago 0 replies      
This would be perfect for canvas games. It means that image.gif doesn't have to load, which typically requires initializing a new Image() and attaching an onLoad function. This simplifies all that.
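The image-loading boilerplate nicklovescode describes can be made concrete. This is a hypothetical helper (the name and URL are mine, not from Spin.js or any game framework) showing the pattern a canvas game needs with a GIF asset: create an Image and wait for its onload before drawing.

```javascript
// Load an image asset and invoke a callback once it is ready to draw.
// new Image() is a browser-only constructor.
function loadSprite(url, onReady) {
  var img = new Image();
  img.onload = function () {   // fires once the bytes have arrived
    onReady(img);
  };
  img.src = url;               // kicks off the request
  return img;
}

// Usage (hypothetical; assumes a 2d canvas context `ctx`):
// loadSprite('spinner.gif', function (img) {
//   ctx.drawImage(img, 0, 0);
// });
```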
jdelsman 5 days ago 0 replies      
Best thing I've seen all week. Yet another nail in the coffin for image assets on the web. Thanks, CSS3!
paisible 5 days ago 0 replies      
Very cool - love the way it's configurable, good job.
dougaitken 5 days ago 0 replies      
Amazingly IE7 on XP (work PC) handled that like a trooper! Good job
wyclif 4 days ago 0 replies      
Resolution independent


I am nothing paulbuchheit.blogspot.com
493 points by dwynings  3 days ago   126 comments top 65
edw519 3 days ago 2 replies      
...you shouldn't compare yourself with others -- you didn't start in the same place or with the same challenges...

Reminds me of this:

Reb Zusha was lying on his deathbed surrounded by his disciples. He was crying and no one could comfort him. One student asked his Rebbe, "Why do you cry? You were almost as wise as Moses and as kind as Abraham." Reb Zusha answered, "When I pass from this world and appear before the Heavenly Tribunal, they won't ask me, 'Zusha, why weren't you as wise as Moses or as kind as Abraham,' rather, they will ask me, 'Zusha, why weren't you Zusha?' Why didn't I fulfill my potential, why didn't I follow the path that could have been mine."

coryl 2 days ago 1 reply      
Study any martial art (preferably one where you can spar) and you'll forever understand the meaning of ego. You show up, do drills/padwork, think you're making progress, and then spar someone who beats you up. You do the same thing the next day, and the next, and the next. Eventually you improve and make progress because you're training so hard. But there's always that one guy you can't beat, and that new kid who's far beyond his years in talent. Coach is also now telling you you're making mistakes on things you thought you'd put behind you, and it's frustrating the hell out of you that you can't fix it fast enough. Your technique isn't up to par, your cardio is garbage because you had pizza and beer, and your training partners are running circles around you.

That's when you realize you know nothing, that even after these years of training and experience, you feel even less knowledgeable about the art than when you began. Depressed about your progress, you figure you have two options to deal with it: 1) quit... or 2) keep showing up. But by now, you love it too much and it's become a part of your life, so quitting isn't an option. All that's left to do then is to continue showing up.

Eventually, the ego is beaten out of you by every failure/loss/disappointment in your daily training. You've tapped out to newer people, younger people, smaller/bigger/"dumber" people, so that it doesn't even shock you to perform poorly against a total beginner. From here, self-realization naturally guides you into a more focused path for self-improvement. What you want to achieve today is far different from what you thought you wanted out of martial arts in the beginning. Telling apart someone who thinks they "know" from someone who truly "knows" is far easier. You'll realize how little you know, and it will humble you. But hopefully you'll come to peace with who you are, and realize what it takes to be where you want to be.

jmtame 3 days ago 2 replies      
Reminds me of a lecture given on Beginner's Mind that emphasizes the pitfalls of intellectualism:

'Can we look at our lives in such a way? Can we look at all of the aspects of our lives with this mind, just open to see what there is to see? I don't know about you, but I have a hard time doing that. I have a lot of habits of mind -- I think most of us do. Children begin to lose that innocent quality after a while, and soon they want to be "the one who knows." We all want to be the one who knows. But if we decide we "know" something, we are not open to other possibilities anymore. And that's a shame. We lose something very vital in our life when it's more important to us to be "one who knows" than it is to be awake to what's happening. We get disappointed because we expect one thing, and it doesn't happen quite like that. Or we think something ought to be like this, and it turns out different. Instead of saying, "Oh, isn't that interesting," we say, "Yuck, not what I thought it would be." Pity. The very nature of beginner's mind is not knowing in a certain way, not being an expert. As Suzuki Roshi said in the prologue to Zen Mind, Beginner's Mind, "In the beginner's mind there are many possibilities, in the expert's there are few." As an expert, you've already got it figured out, so you don't need to pay attention to what's happening. Pity.'

alecst 3 days ago 0 replies      
Reminded me a little of this quote from The Picture of Dorian Gray:

Lord Henry stroked his pointed brown beard and tapped the toe of his patent-leather boot with a tasselled ebony cane. "How English you are Basil! That is the second time you have made that observation. If one puts forward an idea to a true Englishman -- always a rash thing to do -- he never dreams of considering whether the idea is right or wrong. The only thing he considers of any importance is whether one believes it oneself. Now, the value of an idea has nothing whatsoever to do with the sincerity of the man who expresses it. Indeed, the probabilities are that the more insincere the man is, the more purely intellectual will the idea be, as in that case it will not be coloured by either his wants, his desires, or his prejudices. However, I don't propose to discuss politics, sociology, or metaphysics with you. I like persons better than principles, and I like persons with no principles better than anything else in the world. Tell me more about Mr. Dorian Gray. How often do you see him?"

zupatol 2 days ago 1 reply      
Thirty or forty years ago, people apparently had much stronger identities, according to an old French psychologist I heard on the radio. People were much more inclined to think of themselves as 'a communist' or 'an artist'. It's true that these thoughts are prisons, but not having them sends people to the psychologist for other reasons. Unfortunately I don't remember what he said about these new problems.

I guess one problem with discarding your identity as a father is that you are also questioning responsibilities that are vital to your children. Another problem of going without identity is that having a sense of belonging to some community becomes more difficult. That's something I have been missing personally.

dpritchett 3 days ago 2 replies      
Looks like a riff on "Keep Your Identity Small":

If people can't think clearly about anything that has become part of their identity, then all other things being equal, the best plan is to let as few things into your identity as possible.


nevvermind 2 days ago 3 replies      
I don't know much Zen, but after the "Zenish" comments in here, I guess lots of HN-ers do, right?

I maybe am a case of "western individuation syndrome", but for me, losing individuation for protecting myself sounds rather like an oxymoron. Actually, when there's nothing to protect, there's nothing to improve, whatever that guy says. And even if you might say that Buchheit didn't suggest to actually dissolve your personality, it very much seems like he did.

Low expectations, realism when faced with your own challenges, sane aggressiveness or indifference, confronting your own prejudices, vices and frustration, are all possible when one's brain is mentally trained, not when you're living with a general sense of "losing yourself" or when one's dissolving one's ego.

I don't think "letting go of your identity" leads to "a better version of our selves" or "true self improvement", but to a toxic sense of not being who you are.

"But I am nothing, and so I am finally free to be myself." - if you need to be nothing in order to be free, you ARE nothing.

"By returning to zero expectations, by accepting that I am nothing, it is easier to see the truth." - what does zero expectation have to do with nothingness?

"If I were smart, I might be afraid of looking stupid." - that's not being smart, that's being westernly-smart. Change "smart" with "wise" and see if that sentence makes any sense.

Why does someone always preach extremes to get rid of another? Now I'm being artistically literal: when you say "I am nothing", just lose the "I".

Hey, I had frustrations, I had problems and conquered most of them, partly with indifference, partly with matured ego, partly with higher self-barricades, but not once did I think of dissolving my ego. What the deuce? - I kinda need it! I was taught to fight and gain knowledge, but then I learned that, when fighting something, you actually give it meaning, so then I learned to give up. So this blog post does resonate with my experience to some extent.

Preparing for "He didn't mean to ACTUALLY renounce your personality": it's dangerous to use metaphors or ambiguous expressions when your next paragraph is a plain-life description. Just don't. Use "lower your expectations" instead of "be nothing". This is logically/biologically wrong. You can't be nothing, but you can't be all of it either, so, in the mediocrity principle, just be something, because you already are (how's that for metaphoric?).

sidman 3 days ago 1 reply      
I think there is a fine line you need to tread. I agree with what paul has to say but sometimes what we care about is what drives us.

For the longest time I worked as a consultant but I never cared. When someone said to me, "So what do you do?", I just said I work with computers. My rank didn't matter to me, my role, my status, nothing. I wouldn't come to work in a suit or tie or even a dress shirt (I came in just a t-shirt and jeans), and it was very weird to many who were watching. I guess because they cared and wondered why I didn't...

They would say, "You're a consultant for a Big 4, how can you wear those clothes? Cause I personally can't."

I would respond by saying, well, I don't care, I just want to make sure the customer is happy regardless of what my official role is. That way I could do my work, not have to redo things, and then go home. Also, one of the important things, I think, is that I detached myself from being a consultant. I was nothing -- not a manager, a consultant, a senior consultant -- none of the roles that have certain things attached to them, so I felt free and just wore what I wanted, with the one rule that the customer needs to be happy.

When I found out that I could get away with that, I experimented with a few other things too. Like, if I was tired during lunch, I would sleep on a park bench if it was a nice sunny day. I stopped thinking, "hey, I'm a professional and can't be seen sleeping on a park bench, cause that's what bums do," and once I got over it I thought, "hey, who cares?" It was easy, and I would come back to the office invigorated because of a quick 20-30 minute nap :)

However, the caveat is when you CARE enough about, say, programming. If you start to tell yourself you don't care anymore, you lose a certain desire, which, if you care about it, isn't very good. If you keep telling yourself "hey, I am a programmer," then that comes with certain things, such as writing code, being half decent at math, being logical, etc., and being good at those things isn't a bad thing.

So I think I get what Paul is saying when he says "I am nothing," but I think you can't apply that to things you care about, cause it will cause you to not care. But applying that way of thinking to things you might care about, but deep down know are just for perception, or kinda silly, or not really important, has some surprisingly good results :)

maayank 2 days ago 0 replies      
This reminds me of the chapter about "The Cosmic Joke"[1] from a Tibetan Buddhism book[2].

"[speaking about ego] We set up a background, a foundation from which we can go on and on to infinity. This is what is called samsara, the continuous vicious cycle of confirmation of existence. One confirmation needs another confirmation needs another…

The attempt to confirm our solidity is very painful. Constantly we find ourselves suddenly slipping off the edge of a floor which had appeared to extend endlessly. Then we must attempt to save ourselves from death by immediately building an extension to the floor in order to make it appear endless again. We think we are safe on our seemingly solid floor, but then we slip off again and have to build another extension. We do not realize that the whole process is unnecessary, that we do not need a floor to stand on, that we have been building all these floors on the ground level. There was never any danger of falling or need for support. In fact, our occupation of extending the floor to secure our ground is a big joke, the biggest joke of all, a cosmic joke."

[1] The chapter is fully available here: http://www3.telus.net/public/sarlo/Ytrungpa.htm.

[2] http://www.amazon.com/Myth-Freedom-Meditation-Shambhala-Libr...

dporan 3 days ago 0 replies      
What a wonderfully thought-provoking and inspiring piece. Thanks, Paul, for sharing it.

In a similar vein, from David Foster Wallace's 2005 commencement address at Kenyon College:

"If you worship money and things -- if they are where you tap real meaning in life -- then you will never have enough. Never feel you have enough. It's the truth. Worship your own body and beauty and sexual allure and you will always feel ugly, and when time and age start showing, you will die a million deaths before they finally plant you.... Worship power -- you will feel weak and afraid, and you will need ever more power over others to keep the fear at bay. Worship your intellect, being seen as smart -- you will end up feeling stupid, a fraud, always on the verge of being found out. And so on."


mrphoebs 3 days ago 2 replies      
An oversimplification

suffering = your self image vs perception of reality self

The argument goes that man is a prisoner of his self image. This self image is a mixture of his desires, wants, tastes, hopes, fears.... This can be seen as a self image that arises out of conditioning by society and self. You are like a frog in the well and imagine the well to be the universe. You are limited and shaped by the well. How can one know what the possibilities are unless they ascend from their own intellectual/egotistical/societal wells or the well of self image?

On the other hand, I have noticed that rejection of natural tendencies leads to suffering as well. No matter how hard we try, the self will never be a blank tape. When you reject the self image, your self image becomes "I'm he/she who rejects the self image imposed on me". So now you are straight back where you started with a brand new self image, only this time you are more observant of your flaws (tendencies of self). So there is still suffering here.

Let me oversimplify again
the frog = Neo in the matrix,
Ignorance is bliss = Cipher in the matrix

ristretto 2 days ago 0 replies      
I've never seen so many "reminds me of" comments on a post before.
bfe 2 days ago 0 replies      
Excellent. Any true hacker must understand every level of the development process, and that includes one's own mind, standing back and questioning the inner thought processes that one normally thinks of as the self.
amirhhz 3 days ago 1 reply      
A recurring theme from great thinkers (perhaps mainly more in the East, though) over the ages. Glad to see it finding an audience on HN.

This excerpt from Rumi seems apt:

  "Knock, And He'll open the door
Vanish, And He'll make you shine like the sun
Fall, And He'll raise you to the heavens
Become nothing, And He'll turn you into everything."

My take on this line of thought is that as long as you consider yourself as being a "someone" or "having a self" you are always in conflict with other selves and only if you become nothing you remove the inherent conflict.

BasDirks 2 days ago 0 replies      
The misinformation and misinterpretation of Buddhism and Western philosophy in the comments is embarrassing, as well as the pop-spirituality babble.
davidhollander 2 days ago 1 reply      
Premise: we are what we think about. Being nothing requires one to think about nothing. It's actually quite a lot of work to think about nothing! The brain is constantly solving problems while awake and while asleep, building up momentum.

Proposal: Instead of expending massive amounts of energy bringing an object of such high inertia to rest, why not just change the inputs you are feeding it to gradually alter its direction? Don't focus on the cessation and extinction of the turning of mind. Focus on feeding the turning of mind solvable or aesthetically pleasurable problems, to decrease the bandwidth occupied by unsolvable\fear based problems.

tldr: I'm not convinced the epicness of ego destruction is necessary for ego transmutation, if that is one's goal.

edit: Well, I just realized my analogy does not hold for all cases. In Physics, if you want to change the direction vector of an object in motion, to the opposite of its present heading, it will require at least as much energy as bringing the object to rest. So focusing on ego destruction could be worth it depending on where you want to go and where you are now.

Additionally, we also know from Physics that all motion is relative, velocity cannot be measured without a frame of reference. I think the Buddhists would argue that cessation of the turning of thought provides this otherwise missing frame of reference, enabling the thought\ego vector to accurately be measured when the turning of thought restarts.

DanielBMarkham 2 days ago 2 replies      
This is good. Reminds me of my observations on Hugh Everett's life and my conclusion to believe in anti-solipsism. http://www.whattofix.com/blog/archives/2007/11/the_first_ant...

I'd encourage Paul to take the next step, and realize that not only are you nothing, what you do really doesn't matter at all (Perhaps very difficult for Paul to believe, given his accomplishments! But true anyway)

Once you realize you're nothing, and what you do in life won't really matter -- life is fundamentally and irrevocably absurd -- then you can really be free to make the most with what you have. Because just realizing you transcend labels doesn't take the existential pressure off until you realize you also transcend existence itself -- you are truly and deeply nothing. At that point, you realize that the decisions a person makes are the only thing they truly own. This is the beginning of freedom.

It all sounds a lot like existentialism 101. Good stuff!

joseakle 1 day ago 0 replies      
...And it starts with nothing. ...

I disagree, it starts with love.

What is good?

Having zero expectations and being humble are both noble. But being nothing is just impossible. You already are something. Goodness comes from love, Integrity comes from knowing what is good, so don't be evil, be good.

What is true?

I guess the point being made is that knowledge of the truth starts with knowing I am ignorant, like the beginner's mind, or a childlike curiosity, free of prejudices, a free mind, or to paraphrase Plato: "This man, on one hand, believes that he knows something, while not knowing [anything]. On the other hand, I - equally ignorant - do not believe [that I know anything]." [1]

1. http://en.wikipedia.org/wiki/I_know_that_I_know_nothing

Eliezer 3 days ago 0 replies      

  I have abandoned my path.
I have forsaken my role.
I have forgotten my name.
I have lost my soul.

-- unpublished fiction

mjijackson 3 days ago 1 reply      
I am a husband. I am a father. I am a child of God.

There is a fine line between putting yourself in a box that you (or others) create for you and knowing who you are. The classifications that Paul lists in his post are of the first kind. The second kind, you can't really change.

jonmc12 2 days ago 0 replies      
Love the post, but nothing? I say adopt 2 or more senses of self, and get really good at using them at the right time.

I think moving beyond an ego-based sense of self is the best way to think rationally and accomplish a goal. In fact, the biggest benefit is probably being able to rationally understand other people's point of view without a sense of self clouding up your interpretation. But, for me at least, it has not been pragmatic to abandon a sense of self at all times.

For instance, it is much easier for me to relate to my grandmother with a stronger sense of self - or at least project a persona that gives that appearance. You can't really like people or things (in the most basic sense) without ego. Nor can you fully extend emotional empathy in the purest sense when you have no sense of self. With no ego there are many professional environments that will simply drain your energy - even if you have a decent persona for relating emotionally with others.

For me, the better answer has been to look at my self (more specifically my brain) as a library of different selves. Yes, I am nothing, but I must at least be a controller that responds to my environment with the most relevant sense of self at any point in time. I would suggest there is an evolution of self beyond nothing.

srjk 3 days ago 0 replies      
I am rarely moved to comment on a post. This one seems to be especially thoughtful and sincere.

I understood the core message to be that we shouldn't let labels that define some aspects of who we are constrain us.

Or, in programming terms: mixins not class hierarchies :)

Favourite quote: "True self improvement requires becoming a better version of our selves, not a lesser version of someone else."

vannevar 1 day ago 0 replies      
Bliss is overrated. The greatest achievements often come from discontented, even tormented people. I for one don't want to live in a world where every book reads like Deepak Chopra.
pygorex 2 days ago 0 replies      
> But I am nothing, and so I am finally free to be myself.

But if I am nothing what entity is contemplating my nothingness? I most definitely must be a something - thinking of myself as a non-something is necessarily a delusion. It's illogical to engage in identity denial - instead I should try to engage in identity variation, approaching myself (and others!) with a different set of assumptions from time to time (which I believe was the spirit of the original post). It's silly to start a process of self-actualization by denying the very thing I am trying to actualize.

Or, as follows: I exist as a unique locus of space and time and so do you. I can share a room, a table, a meal, a conversation, even a lifetime with you but I can't be you - I can only be myself experiencing you. Your identity is yours and yours alone and you are always free to be yourself. In fact you are required to be yourself - after all who else could you be?

I intend to own my identity for the brief flickering moment that it exists. The mindless and momentous machinations of the universe have created the fragile consciousness that I am, and very soon these same machinations will erase me to nothingness. I see no reason to get a head start on being nothing.

thedigitalengel 2 days ago 1 reply      
Reminds me of Tyler:

It's only after you've lost everything that you're free to do anything.

badclient 2 days ago 0 replies      
Not so simple. Having no identity can pose its own set of problems.

Paul addresses the problems posed by over-committing to your identity. But if you do run with the title of his post and remove all identity, you don't automatically get a self that is eager to learn new shit. That is the best case scenario.

You can just as easily be the guy who knows a lot - but lacks confidence to progress because he doesn't think he is a good developer or plumber. Or you can be the guy who doesn't know much and thus doesn't have an identity.

We've all met people who know exactly who they are("i am a kickass ruby developer beyond anything else in life") and we know people who know shit load of something but continue to ponder who they really are("kickass ruby developer professionally but really what am i?"). Both, at certain extremes can be equally harmful.

corin_ 3 days ago 1 reply      
This isn't relevant to the actual message of the post, but something that caught my eye was in the paragraph about thinking you are "too X to be Y". Most of them make sense, and I can understand people thinking the Y because of feeling the X. Except these two:

  too effeminate to be straight

too smart to be kind

Am I being foolish or do those two not fit? I can imagine someone thinking "I'm too sensitive, that's not how a man should be", I can't imagine someone thinking "I'm effeminate, I guess I can't be straight after all".

shadowmatter 3 days ago 1 reply      
Good post. Reminds me of that quote by Oscar Wilde: "Be yourself; everyone else is already taken."
crizCraig 3 days ago 1 reply      
I find not caring about what people think ends up hurting my personal relationships. I have to try really hard to be aware of my identity and how that fits in with the people around me. Naturally I am aloof. Is that what this article is saying I should be?
greengarstudios 2 days ago 0 replies      
Reminds me of this:

"Remember, dear brothers and sisters, that few of you were wise in the world's eyes or powerful or wealthy when God called you. Instead, God chose things the world considers foolish in order to shame those who think they are wise. And he chose things that are powerless to shame those who are powerful. God chose things despised by the world, things counted as nothing at all, and used them to bring to nothing what the world considers important. As a result, no one can ever boast in the presence of God."

- 1 Corinthians 1:26-29 (NLT)

zitterbewegung 3 days ago 1 reply      
This is why I like to do science. I feel like I am constantly learning. I know that I know nothing and I can only know a little bit more by science. When I acknowledged my limitations and just sat down and calculated then I performed better. Sort of a zen like way of thinking. I know that I know nothing and instead I must just calculate and manipulate symbols.
skrebbel 1 day ago 1 reply      
I see the sense of this, but I find it very difficult to accept.

Anyone else struggling with this?

nickmolnar2 3 days ago 1 reply      
I'm also reminded of Daniel Dennett's secret to happiness: find something bigger than yourself and devote your life to it. That too allows you to drop your baggage and become 'nothing'. And it's a great way to leave a lasting impact on the world too.
klbarry 2 days ago 0 replies      
It seems to me quite difficult to follow the philosophy described in this essay. It also seems to me an undesirable life, though I can see why a person would want it. Much satisfaction in my life comes from seeing myself as a "good" x, and I experience very little dissatisfaction from these labels. Perhaps this will change as I age.
dkurth 3 days ago 2 replies      
On the one hand, he says, "if we aren't changing for the better, then we are just slowly decaying," suggesting that there is such a thing as "better."

On the other hand, he talks about "returning to zero expectations" and being "nothing."

I conclude that Paul wants to be a better nothing. Or possibly that he is slowly decaying. I'm not sure you can be nothing and also have a standard for getting better.

dave1619 1 day ago 0 replies      
The question is: are you genuinely trying to become nothing, or are you trying to become nothing in order to become somebody by accomplishing something, which would mean that you really aren't nothing?
euroclydon 2 days ago 0 replies      
Great essay! I'm definitely going to identify myself with others that value the "I am nothing" mantra. I hope I don't begin to think I am something, that could cause anxiety.
JDulin 3 days ago 0 replies      
Awesome essay from Paul, very insightful.

This mindset is good at not only helping yourself become yourself, but placing the best people in your life.

If you spend your entire life trying to please everyone you meet, deciding who you should be friends with and who you shouldn't, and becoming a person that others will like more, you will ultimately not make the friends you would have made by just being yourself. And those are the friends you want most.

Perhaps even worse, you could waste endless amounts of time on people who you think you should be friends with or have in your life, but really shouldn't. Sooner or later, the relationships with these people that you built on the foundation of some artificial idea of yourself will crumble. If you know that you are nothing and then become yourself, you will be surprised how many amazing friends you will find in your life.

username3 1 day ago 0 replies      
If I were nothing, I might be afraid of being something.
dan-k 2 days ago 0 replies      
These lines of thinking that involve rejecting an entire set of propositions about something always end up devolving into something like Russell's paradox if you take them to their natural conclusion. In this case, being in a state of true nothingness would preclude the possibility of considering nothingness an ideal state to be in, as that is in itself an individuating characteristic. It's the same problem that arises with pure moral relativism, which is itself an absolute moral position.

I'd be willing to bet that human nature is as complicated as the system of natural numbers; perhaps we keep butting our heads into walls like this because we're trying to find answers that would violate Gödel's incompleteness theorems...
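The self-reference this comment gestures at can be stated compactly. A standard formulation of Russell's paradox (my addition, not the commenter's): define the set of all sets that do not contain themselves, and asking whether it contains itself is contradictory either way.

```latex
R = \{\, x \mid x \notin x \,\}
\quad\Longrightarrow\quad
R \in R \iff R \notin R
```

"Nothingness as an ideal" plays the role of $R$ here: holding it as a distinguishing ideal contradicts the nothingness it asserts.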

rinkjustice 3 days ago 0 replies      
I can relate. I too am trying to abandon my self and my "brand" because it's spiritually suffocating. I don't want to care what - oh, someone just voted up my last comment!

It's on the todo list anyway.

maeon3 3 days ago 3 replies      
This is a major component of Christianity. (I'm not preaching here) I thought it was interesting that the non religious world is figuring out things that have been core teachings for the last 2800 years.

In the Christian world, it is called "taking yourself off the throne and putting another entity on it". The entity that gets put on it is variable, but the constant is that you are not on it.

mannicken 2 days ago 0 replies      
I agree. Having ego is so outdated. Ego is like believing that blacks are an inferior race, or earth is flat, or earth is 6000 years old.

In other words, "the apple is being eaten by you just as much as you are eating the apple". It's not so much that you have chosen to eat the apple as that the apple chose to hit the receptors in your brain that will make you eat it. But it's all ridiculous: you and the apple are one harmonious system.

foysavas 3 days ago 0 replies      
There is clear resonance between his main point and Kazantzakis's famous epitaph:

Δεν ελπίζω τίποτε. Δεν φοβούμαι τίποτε. Είμαι λεύτερος.

I hope for nothing. I fear nothing. I am free.

That said, awesome post.

akivabamberger 2 days ago 0 replies      
Here's the thing, though: as long as others see themselves according to some paradigm, they will likely see you in relation to them and cast you in some social role.

In social interactions, we assume some identity to relate with those around us. Inevitably, those interactions (if we're receptive and respecting of who we interact with) will affect our own thoughts, including (sometimes) our sense of self.

No man's an island, and no man's nothing so long as he lives in a society. Saying "I am nothing" is just as bad as saying "I am X"

symptic 2 days ago 1 reply      
As I read this, I kept thinking of the movie American Beauty. If you found this post thought-provoking or disagree with it, watch this movie.
wolfhumble 2 days ago 0 replies      
To be completely honest I stopped reading when I saw the picture of a man with a big automatic weapon.

Well-known person or not, advice and weapons do not go together.

waffle_ss 3 days ago 1 reply      
Having a realistic sense of self is great; diminishing your self into a nothingness (i.e. totally altruistic) is not something to be admired. Then again, maybe I've read too much Ayn Rand.
marknadal 2 days ago 0 replies      
Meanwhile Steve Jobs, Bill Gates, Larry Page, and Mark Zuckerberg are focusing on making the world a better place, not on being nothing.
tmsh 2 days ago 0 replies      
You may enjoy early Socrates if you haven't already.
YuriNiyazov 2 days ago 0 replies      
I am kind of curious as to what prompted this post now rather than at any other time.
brok3nmachine 2 days ago 0 replies      
This post is great, and from reading the comments, I'm happy to see plenty of people being touched. So, not having read the Zen Buddhist book on my shelf, I do have this to say... I am not nothing, I'm amazing. While I may disagree with others, they are beautiful and I have much love for them. Nobody is nothing.
drungli 2 days ago 0 replies      
I'm sorry, but this article is saying a lot of nothing... and I felt like I wasted my time when finished reading it.
surrealize 2 days ago 0 replies      
It's great to let go of labels--it helps you stop worrying about whether you're being a good enough <whatever>. But saying that you're "nothing" isn't a good way to put it. You're still something, you just don't have to worry about being a particular, externally-defined thing.
rooshdi 3 days ago 0 replies      
Inspirational read. Our own insecurities mean nothing in the end. Enabling others to erase theirs means so much more.
hussong 2 days ago 0 replies      
Your genuine humility and down-to-earthness never cease to amaze and inspire me.
g-garron 2 days ago 0 replies      
I once saw on the Discovery Channel that Buddha said:
"You will only be happy when you kill all your desires, when you want nothing." It was something like that; so, as soon as you want nothing, you are free, and then you are happy.
Alex3917 3 days ago 0 replies      
As Heraclitus would say, all flows.
andrewcooke 3 days ago 0 replies      
what's the implicit context here? it's hard not to think that this is related to the nym-wars, but how?

(i hope it's not "if we can be happy with who we are, then we will not mind using fixed identities"...)

sausax82 2 days ago 0 replies      
The article resonates with a branch of philosophy called non-duality. Lately I have been reading a lot about non-duality, mainly from books written by J. Krishnamurti. It all boils down to living in an egoless state and getting rid of the illusion of choice. It's very easy to understand intellectually, but the problem lies in realizing it in everyday life.
mfceo 1 day ago 0 replies      
it's not good to lose one's ego.
donnaware 2 days ago 0 replies      
yes grasshopper, to be nothing is to be everything, to be everything is to be one with the universe. Now snatch the pebble....
kwithl 2 days ago 0 replies      
"many of our weaknesses are actually strengths"

So when you say that you are nothing, you mean that you have many strengths. Flip the ladder again: this is a circular double helix.

Qa8BBatwHxK8Pu 3 days ago 0 replies      
What kind of cult is this?
sdfkdfdfjdfng 2 days ago 0 replies      
So is this proof that money rots your brain? Because this guy is very smart, but he sounds...like a homeless guy in Berkeley. I would know, I've had conversations with them. I mean, I wouldn't want this guy on my board if I was crazy/stupid/motivated enough to ask him for money for a start-up, just on the basis of this slightly alarming essay.

Two things come to mind: PUA theory and having so much money that the boredom literally drives you nuts. So I'm thinking this guy may be headed for cautionary tale territory.

Anyway, rock on, faggots.

Why Arabic is Terrific idlewords.com
489 points by ColinWright  2 days ago   234 comments top 37
cletus 2 days ago  replies      
I'm a native English speaker. I studied a little French and speak (and can read/write) some German (with limited vocabulary). For the English speaker, German grammar is painful enough (3 genders, agreement of nouns and adjectives by case, number and article, separable verbs, etc) but, based on what reading I've done, German is still "Latin lite".

A friend of mine (also native English speaker) lives in Taiwan and has learnt traditional Chinese (rather than simplified, as is now taught and used in China). The character memorization, to me at least, is horrifying. But the grammar is fairly simple (apart from tonal variation in words).

The evolution of languages (linguistics I guess) interests me greatly so first up, thanks for the post.

With all these different language systems, one has to wonder how they evolve.

Several events in history are of particular interest.

The first is what happened to English. English in the 10th century was basically the same language as German (Althochdeutsch or Old High German, to be precise). In 1066, the Normans conquered England, bringing French which became the official and court language of England for several centuries.

This had several important effects:

Firstly, many words migrated to English from French and Latin (although there were Latin words previously). Often the English form was "low brow" whereas the Latin/French version were "high brow".

Secondly, without a central authority enforcing a language standard, the language evolved hugely. Old English for Modern English speakers is basically unreadable. Middle English is mostly comprehensible. This was a massive change.

Along the way, English basically lost the concept of case (apart from pronouns), gender (again, apart from pronouns) and word agreement. Grammar was also greatly simplified (almost everything in English is done on word position rather than word ending like most Indo-European languages).

The TL:DR version of this is that there is a reasonable case to be made that a central authority actually stifles language "innovation". If true, one could argue that how arbitrary (rather than regular) a language is is a measure of the state control over that language during its history.

I don't presume to argue that this is true but it's an interesting idea.

The second interesting historical event was the switch in 1929 from the Arabic alphabet to a Latin alphabet for Turkish, which had a massive increase in literacy in the following years (since the alphabet is simple and phonetic).

The third interesting event is the rise of computers. I would argue that it was almost inevitable that this would happen in a country with either the Latin or Cyrillic alphabet. The reasons are:

1. Limited number of characters (think: keyboards and character sets); and

2. Limited variation in those characters (in English: uppercase and lowercase). Compare that to Arabic.

IMHO (1) is incredibly important. In Mandarin, if I want to tell you a new character I have to show you. There is no way to describe it. In English, a new word can be communicated verbally. I don't think you can overstate how important this distinction is.
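To make the character-count gap concrete, here is a small Python sketch. The Unicode block boundaries are my own rough stand-ins for "alphabet size", not exact letter counts:

```python
# Rough comparison of writing-system sizes via Unicode code-point ranges
# (approximate blocks, chosen for illustration only).
latin_letters = range(0x0041, 0x005B)    # A-Z: 26 code points
arabic_letters = range(0x0621, 0x064B)   # core Arabic letters: 42
cjk_ideographs = range(0x4E00, 0xA000)   # CJK Unified Ideographs: 20992

for name, block in [("Latin", latin_letters),
                    ("Arabic", arabic_letters),
                    ("CJK", cjk_ideographs)]:
    print(name, len(block))

# The same gap shows up in UTF-8 byte lengths per character:
for ch in ("A", "\u0639", "\u4e2d"):     # A, Arabic 'ain, CJK 中
    print(ch, len(ch.encode("utf-8")), "byte(s)")
```

Tens of thousands of first-class symbols versus a few dozen is exactly the keyboard and character-set pressure described above.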

Asian computer use has evolved a number of schemes to get around these issues, such as character combinations to represent certain characters (which, again, you have to learn) and, more recently, the use of graphics pads to draw characters.

The last thing I wanted to mention was this paper [1], which shows a mathematical relationship between languages becoming regular and the frequency with which words are used (the more a word is used, the longer it takes to become regular).
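As I recall the linked result, the reported scaling is that a verb's regularization half-life grows roughly with the square root of its usage frequency. A toy sketch of that relationship (the exact functional form is my recollection, so treat it as an assumption):

```python
import math

def relative_half_life(freq_ratio):
    # If verb A is used freq_ratio times as often as verb B, its
    # regularization half-life is ~sqrt(freq_ratio) times longer
    # (the square-root scaling attributed to the linked paper).
    return math.sqrt(freq_ratio)

# A verb used 100x more often takes ~10x longer to regularize.
print(relative_half_life(100))
```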

Anyway, enough rambling on my part.

[1]: http://www.physorg.com/news111241495.html

donw 2 days ago 3 replies      
Marginally on-topic, but I just looked at the tuition for the Monterey Institute.

A summer program is $15k USD; a full semester is $16k USD.

If you want to learn a language, and have a spare $15K and the time for intensive classes, you're far better off just booking a plane ticket, going to wherever it is, and hiring a private tutor.

Let's say you wanted to learn Chinese. You can live very, very comfortably in Beijing on about $2k USD per month, including the cost of a private tutor, food, entertainment, and travel.

That means six months in-country for less than the tuition of the Institute. Six months of daily use plus intensive study will move you, in terms of proficiency, ahead of 90% of people who hold degrees in Chinese.

You could do something like this even in less westerner-friendly countries, and learn a million times more than you will at any language school -- there is absolutely no substitute for actual experience.

bluishgreen 2 days ago 4 replies      
"The combination of numerous dialects and a formal/informal continuum is pretty much unique to Arabic and gives rise to fascinating situations watching Arabs calibrate their language based on the situation and the linguistic background of their interlocutor."

Nope, not unique.

Tamil (http://en.wikipedia.org/wiki/Tamil_language) is the same way. I am a native Tamil speaker.

blahedo 2 days ago 0 replies      
To get a sense of the status of Arabic dialect with a referent that might make more sense to a western audience (or maybe not), try this approximation: Arabic now is like Latin a millennium ago.

In the 11th century, standard Church Latin had evolved slightly relative to Cicero but was clearly the same language; someone who was fluent in Church Latin could reliably travel anywhere in Christendom (and to cosmopolitan cities elsewhere) and make themselves understood. The local dialect there might have evolved from Latin, or might not, but there'd be someone educated and literate that they could communicate with. In the areas where the language had evolved from Latin, the hoi polloi could kinda sorta make out the Latin (better in some areas than others), and the cleverer ones could figure out the relationships between their language and Latin, and the educated ones would just go learn Latin (but perhaps have an easy time of it).

Thus also, mutatis mutandis, with Modern Standard Arabic. The analogy isn't perfect but it turns out to be pretty darn good and lets you make some good guesses about the situation on the ground in the Arabic-speaking world and the mutual intelligibility of, say, Qatari and Tunisian (i.e. not very much).

hsmyers 2 days ago 1 reply      
Great article on one of my favorite subjects. In the 80's using an IBM PC with a Hercules graphics card I wrote an Arabic word processor so my partner could write to his family in Farsi. The problems of right to left and differing letter shapes depending on leading, middle and ending position was an interesting challenge---not to mention the problem of vowels that occur above and below the letters. That said, I could have done without the attack on DLI---actually having worked with graduates (speaking Korean and Vietnamese) they spoke as natives including idiomatic expressions. Oh well we all have biases and the rest of the posting was excellent!
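The positional letter shapes described here are visible directly in Unicode: the legacy Arabic Presentation Forms-B block encodes each contextual glyph of a letter separately, which is essentially what an early word processor had to select by hand. A small Python illustration (my addition, not from the comment):

```python
import unicodedata

# The base letter beh (ب) is a single abstract code point...
print(unicodedata.name("\u0628"))  # ARABIC LETTER BEH

# ...but has four positional glyph forms in Presentation Forms-B,
# chosen depending on whether it is isolated, final, initial, or medial.
for cp in (0xFE8F, 0xFE90, 0xFE91, 0xFE92):
    print(f"U+{cp:04X}", unicodedata.name(chr(cp)))
```

Modern text shaping picks these forms (or their equivalents in a font) automatically; on a Hercules card in the 80s, that logic had to be written from scratch.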
gwern 2 days ago  replies      
> The combination of numerous dialects and a formal/informal continuum is pretty much unique to Arabic and gives rise to fascinating situations watching Arabs calibrate their language based on the situation and the linguistic background of their interlocutor.

Not Chinese?

(Also, #9 hardly seems like an item to include in an article explaining 'Why Arabic is Terrific'. 'Terrifying', perhaps.)

onan_barbarian 1 day ago 0 replies      
Thought-provoking, but with some comedy gold scattered through it. I thought I vaguely recognized this guy's sense of humor and/or the blog title, and sure enough:


Definitely worth a read for anyone who missed it the first time. Warning: contains irreverent references to pg.

Jun8 2 days ago 3 replies      
Interesting post, but the OP is linguistically naive, which is generally what happens when English speakers encounter another language with more interesting (i.e. "exotic") morphological, phonetic, etc. elements. If Arabic is your first foray into a highly inflectional language, it might look fairly exotic. If you had studied, say, ancient Greek or Latin first, it might look less so.

"you find out that the underlying language is pretty vanilla, and meanwhile there is a stack of three thousand flash cards standing in between you and the ability to skim a newspaper."
This comment about Chinese and Japanese (totally unrelated languages with very different writing systems) is laughable. The syntax of Chinese is simple (it pretty much has none at all) but Japanese syntax is complicated.

The OMG moment one faces when confronted with thousands of weird Chinese signs is partly an illusion. Yes, you have to memorize a lot of things, as in any other language, but the meaning of a Chinese word can be guessed if you know the determinative, even if you don't know the word. There are only about 200 determinatives.

An interesting thing about Arabic is that, due to its tie to Koran, it's been studied linguistically since early times. But other languages can also boast such long examined lives, e.g. Sanskrit or Chinese.

damncabbage 2 days ago 2 replies      
This is a great article, but Firefox chokes on the <?xml ... ?> block at the top, and switches to ISO-8859-1 encoding.

Go View -> Character Encoding -> Unicode (UTF-8) to have the Arabic examples render correctly.
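The mechanism behind the wrong rendering is easy to reproduce: UTF-8 bytes decoded under the ISO-8859-1 fallback turn each Arabic letter into a pair of Latin-1 characters. A sketch (the Arabic sample string is my own, not from the article):

```python
# UTF-8 Arabic text decoded with the wrong fallback encoding.
text = "\u0645\u0631\u062d\u0628\u0627"   # "مرحبا" (hello), sample text
raw = text.encode("utf-8")

print(raw.decode("utf-8"))       # the intended Arabic
print(raw.decode("iso-8859-1"))  # mojibake: each letter becomes two
                                 # Latin-1 characters, starting with 'Ù'
```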

csomar 2 days ago 4 replies      
"There are a few words that take a regular plural suffix, but most of the time to make a plural you have to change the structure of the word quite dramatically:"

The structural change is regular. That is, you don't have to learn each noun's plural.

"The Arabic writing system is exotic looking but easy to learn, which is a rare combination. The language uses a straightforward alphabet, but because letters change their shape depending on what their neighbors are it is quite impenetrable to the uninitiated."

I thought the same thing about English and French, no? When I learned English and French in school, we joined letters.

"Formal Arabic distinguishes between groups composed entirely of women and groups that contain one or more men, and has distinct pronouns, plural forms, and verb conjugations for feminine dual and feminine plural."

The same for French. (Il, Ils, Elle, Elles, On)

"What we call Arabic numerals aren't used in Arabic except in extraordinarily formal contexts. Instead, Arabic uses 'Indian numerals', which look like this:"

Not true. The real Arabic numerals are the ones used today. The Indian numerals were used in the Middle East region because of the strong Indian influence.


Well, if you are really interested in Arabic, then what you should learn is Arabic poetry. It's one of the marvelous human inventions: very hacky, and quite hard to write. The wealth of old books and poems written in Arabic makes the language worth learning.

gmi01 2 days ago 2 replies      
I am a native Arabic speaker, I enjoyed the article, however there are a few mistakes.

In 2. The exceptions are called plural exceptions, which happen much less often than the general rule; otherwise, most of Arabic follows a very specific rule for making plurals from singulars.

In 7. Adjectives have no gender, and therefore al-kutub hadra' (الكتب حضراء), "the books, she is green", simply translates to "the books are green" (hadra' is an adjective and has no gender).

In 9. Formally, Arabic numbers are read right to left, i.e. we read the least significant digit first, although very few people do this.

In 10. It is next to impossible for the average Joe to understand any written text that is 1000 years old, including the Qur'an.

rdouble 2 days ago 1 reply      
As a side note to the side note, the "Arabist" tradition in the US State Department has been around since the mid 1800s. Robert Kaplan wrote a great book about it: http://www.amazon.com/Arabists-American-Robert-D-Kaplan/dp/0...

(The OP made it sound like elites studying Arabic for their ambassador posts is something new. It just faded in popularity when religious fundamentalists started to rise in influence.)

vbtemp 2 days ago 4 replies      
Hebrew has virtually the exact same properties, and is quite simple to learn as it is extremely regular and, like all Semitic languages, is based on a three-letter root system. In fact, in a number of conversations with my Arab friends I'm stunned by just how similar Hebrew and Arabic are (both in good and foul words) and in grammar.
johnyzee 2 days ago 3 replies      
I believe Arabic is the language with the largest variety in sounds. A lot of these sounds don't exist in other languages, such as the deep-throated 'ain and the seemingly many different ways to make an 's' sound, each with subtle differences. A friend of mine said it makes speaking very relaxing as it involves so many parts of the throat and mouth.

I read somewhere once that Arabic is also the language with the largest vocabulary of words - something like three million versus one million for English. Don't know if this is true or not.

glenngillen 2 days ago 1 reply      
"The language of the National Designated Other is bound to switch to Chinese in a couple of years"

I assume this is in reference to the predominant second language for most people? If so, something I've always struggled to agree with is the implication that English is the second language for so many people only because of some fortuitous timing on the part of the British Empire. Technological advancements have certainly helped, as has the fact that as the British fell from their peak, the US (also English-speaking) came to take its place. But is it really just a matter of timing, such that within a generation or two most people will be expected to be somewhat fluent in Mandarin?

Maybe I'm a bit naive because I was raised speaking English, in an English-speaking country, and only fumble my way through a couple of other languages well enough not to get entirely lost while travelling. And I can understand why English can be so difficult for foreigners to learn because, even moving to the UK, I discovered entirely new ways to pronounce words I thought I already knew. But... and I think it's a big "but"... something that has always seemed almost unique to English, in my limited experience, is that you can speak it badly and still be understood. You can speak it really badly, and while people might chuckle at the way you've turned a phrase, you'll still get help. When I compare that to my experiences throughout south-east Asia, Italy, France, and Spain, nothing could be further from the truth. Inflections or emphasis on the wrong syllable can have drastically different meanings that elicit confused looks or something altogether wrong.

I just don't see Mandarin doing the same. But maybe I discount the importance of the economy as a driving factor too much.

ziyadb 2 days ago 0 replies      
The most interesting part is the deviation of spoken Arabic (which as mentioned in the article, varies by region, e.g. Saudis speak different Arabic from Jordanians, or the Lebanese) from written Arabic. I'm a native Arabic speaker but most of my education and upbringing were in English, so I tend to use the "higher form" i.e. written Arabic, when communicating with fellow speakers. And yes, it is terrific indeed.
patrickas 2 days ago 1 reply      
Very interesting article.

This is the first time I hear the plural : ustaath -> usaatatha, I am a native arabic speaker and we always use asaatitha.

Also an old side project of mine, http://yoolki.com for en-ar transliteration

pknerd 2 days ago 1 reply      
Without any intention of sounding biased (let's ignore that it belongs to some faith), if Arabic is recited properly it does sound amazing even if you can't understand it at all. The sounds of the Arabic alphabet are of a kind I have hardly found in any other language, even in my own native language, which is quite rich. Do listen a bit:


hermanthegerman 2 days ago 1 reply      
Thanks, that post was fantastic. Having the Arabic words in a larger font size would have been great, though: it's really hard to look at them and spot the differences at that size (and zooming all the time is annoying).
sethg 1 day ago 0 replies      
As someone who has learned classical Hebrew and Aramaic up to the “if I torture the page for long enough it usually confesses” level, I love the root-and-pattern Semitic grammar, up until the point where I have to look a word up in a dictionary, because a lot of root letters actually drop out of the inflected forms, so I have to flip back and forth looking for possible candidate root forms. And prepositions are just tacked onto the beginnings of words, so especially in un-vowelled text, one can confuse a preposition for part of the root. And Talmudic commentaries are often written in a mishmash of Hebrew and Aramaic, because after all, if you can't read both Hebrew and Aramaic, you shouldn't be reading the Talmud in the first place.... Good times.
_corbett 1 day ago 0 replies      
Arabic is terrific and difficult. I studied it for years, including moving twice to the Middle East but in just a few months of study was better at German than at Arabic simply because of the cognates.

It's definitely worth the effort though.

Cyph0n 2 days ago 2 replies      
Yep, Arabic is indeed a fascinating language. I'd like to add that Arabic grammar is extremely difficult, at least when compared to English. It takes years of study just to understand what short vowel to place at the end of certain words.

To put it simply, Arabic sentences are divided into two main types: "verb" sentences (first word is a verb) and "noun" sentences.

Verb sentences are structured as (from right to left of course):

verb > subject > object

Each word ends with a different short vowel depending on its place in a sentence. The subject for instance always ends in a dhamma ('u' sound), while the object ends in a fatha ('a' sound).

Unlike with the English language, rarely do you find someone who follows grammatical rules in everyday speech. Only scholars or teachers who use formal language apply such rules when speaking or writing.

The vastness of the language is also why Muslims resort to a tafsir (or explanation) of the Quran to understand certain verses and chapters. Words sometimes mean something else when found after a certain word or in a certain context.

iamwil 2 days ago 0 replies      
"Some words have separate broken plurals depending on whether you're talking about a small or large number (the cutoff is somewhere around seven)."

Huh, I wonder if this came about naturally, and if it did, I wonder if it has any tie-in to the number of items people can hold in their heads at one time:


drtse4 2 days ago 0 replies      
Great post.

See this http://en.wikipedia.org/wiki/Varieties_of_Arabic Examples of major regional differences) for more info on the differences between regional dialects.

samuel1604 2 days ago 0 replies      
I learnt Hebrew at Jewish summer camp, and that made it a hell of a lot easier for me whilst learning Arabic for my US government contract (field) work...
Mvandenbergh 2 days ago 1 reply      
I agree totally with #1, the root system is very cool.
The diglossia has two sides to it. On the one hand, it makes it harder to structure learning, because the basic phrases that are the bread and butter of every other language-learning program are usually expressed only in the dialects, and may not even have an agreed-upon spelling in the written language. On the other hand, if you just want to speak, you can spend relatively little time on grammatical features like the dual, because many spoken dialects barely use them.

The real barrier for me though has been that Arabic (like Hebrew) is written without the short vowels in all text except for children's books and religious texts. This means that if you don't know a word, you can't pronounce it properly.
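The unvowelled-text situation can be illustrated by stripping the short-vowel marks programmatically: they are combining characters, so everyday Arabic text simply omits them. A sketch (the sample word and helper name are my own):

```python
import unicodedata

def strip_harakat(s):
    # Drop combining marks (Unicode category "Mn"), which include the
    # Arabic short-vowel diacritics, as everyday written Arabic does.
    return "".join(ch for ch in s if unicodedata.category(ch) != "Mn")

vowelled = "\u0643\u064e\u062a\u064e\u0628\u064e"  # كَتَبَ, kataba, "he wrote"
print(strip_harakat(vowelled))                     # كتب, the bare consonants
```

The stripped form is what a reader actually sees, which is why an unknown word can't be pronounced with certainty.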

bluekeybox 2 days ago 3 replies      
One of my friends (who studied philosophy) and I had a discussion the other day about differences between natural and programming languages. I made a point that my knowledge of Russian in no way helps me speak better English (we both agreed on that), while a C programmer who also knows a language that relies on a different programming paradigm, for example, Lisp or Prolog, is likely to be a more effective C programmer than someone who hasn't been exposed to that paradigm.
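As a toy illustration of the paradigm contrast being described, here is the same computation in imperative and functional style, with Python standing in for the C/Lisp pair:

```python
# Same result, two paradigms: an imperative loop with mutation,
# and a functional expression with no mutable state.
nums = [1, 2, 3, 4]

squares_loop = []
for n in nums:
    squares_loop.append(n * n)

squares_fn = list(map(lambda n: n * n, nums))
print(squares_loop == squares_fn)  # same values, different ways of thinking
```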
bane 2 days ago 0 replies      
Anybody interested in written languages can get lost for hours here http://www.omniglot.com/
dmoney 2 days ago 3 replies      
> For example, here are some "words" consisting of a single letter repeated three times:

> ييي ععع ههه Ù'Ù'Ù' للل

Which for me looks like (asciified):

> USUSUS 010101 UtUtUt UfUfUf UnUnUn

I'm guessing I'm missing a font or something? (Firefox 6)

jinushaun 2 days ago 0 replies      
This is why I love linguistics. Languages are so fascinating. Humans invented so many different ways to speak and write down ideas.
tomtom101 1 day ago 0 replies      
Really great post. I studied Arabic for 15 months at the British Military language school and after working as a "Terp" I went on to run the policy for language support to the British Military. I encourage you to continue with your studies as the more time you spend learning the language the more rewarding it becomes. Once you start to move away from MSA and learn dialects it becomes much easier to speak, but even harder to listen. Good luck and great post.
nivertech 2 days ago 1 reply      
Funny thing, that Ancient North Arabian [1] sounds almost like Hebrew.

[1] http://en.wikipedia.org/wiki/Ancient_North_Arabian

njharman 2 days ago 0 replies      
Whatever the author has planned for a career, I hope he dumps it and decides to be a writer of some sort. That was a truly superior read.
hackermom 2 days ago 0 replies      
If you're looking for something that is terrific for real, read up on Japanese: its entirely syllabic nature is fascinating.
cefarix 2 days ago 2 replies      
Have you ever wondered why numbers are written backwards in English? Why do we write the highest-order digits first and proceed to the lower-order digits?

Well these numbers originated in Arabic, and since Arabic is written right-to-left, writing numbers in Arabic actually follows a more logical order: start with the lowest-order digit, and move on to the higher-order ones.

Apparently when the concept was brought into Europe, they failed to account for the different writing directions.
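cefarix's point can be sketched in a few lines (Python, purely illustrative; the number 305 is an arbitrary example):

```python
# "305" written left-to-right: highest-order digit first.
number = "305"

# A left-to-right reader encounters the digits highest-order first:
ltr_order = [int(d) for d in number]            # [3, 0, 5]

# A right-to-left reader of the same glyphs encounters them
# lowest-order first, which matches how you'd compute the value:
rtl_order = [int(d) for d in reversed(number)]  # [5, 0, 3]

# Either way the value is identical; only the reading order differs.
value = sum(d * 10**i for i, d in enumerate(rtl_order))
print(value)  # 305
```

The same glyph sequence works in both scripts; Europe kept the glyph order but read it in the opposite direction.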

coldarchon 2 days ago 4 replies      
The reason why I will never learn Arabic is that "black man" and "slave" use the same word in that language.
llcoolv 2 days ago 2 replies      
To be honest, most of those neat things exist in many other languages. For example, all the Slavic languages plus the more conservative Germanic ones have a root+prefix/suffix system that plays a similar role. Also, most of the Slavic languages have separate endings for feminine/neuter plural, and Slovene even has the dual.

Apparently, to a person who only knows English, which is a very inconsistent and mixed language, these features can look extremely neat, but the reality is that they are quite common.

It's Official: HP Kills Off webOS Phones and the TouchPad techcrunch.com
434 points by canistr  5 days ago   305 comments top 45
donw 5 days ago  replies      
Once again, HP proves that the collective vision of Bill Hewlett and Dave Packard is so long-dead that the tombstone has crumbled to dust.

I'm gutted. Genuinely.

When I was younger, before Fiorina stepped up to be the first to rape the corpses of the founders, I was a massive fan of HP. My first graphing calculator was an HP48G, which taught me the joys of Lisp... well, Psil, because it came with Reverse Polish Lisp.

WebOS could have been the resurrection of that culture -- fully JavaScript development environment, app development across multiple mobile and tablet platforms with a single environment... just genius. It's sad to see that vision evaporate, along with some of the novel telephony and interactive features that made WebOS a joy to work with.

martingordon 5 days ago 6 replies      
HP spent $1.2B to acquire Palm a little over a year ago. According to Google Patents, Palm is the assignee in 12,000 patents. If HP can sell off the Palm patents for the same $/patent as Google is paying in their Motorola acquisition, they would net $8.8B.

It's a sad day for innovation when patents are worth more than the product they back. Why wouldn't HP kill off webOS? How long would it have taken them to make $7B in profits off of webOS?
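The back-of-the-envelope math works out as a quick sketch (the $12.5B price and ~17,000 granted Motorola patents are commonly reported figures, treated here as the commenter's assumptions, not audited numbers):

```python
# Figures from the comment and press reports (assumptions, not audited).
motorola_price = 12.5e9    # Google's reported offer for Motorola Mobility
motorola_patents = 17_000  # commonly cited count of granted patents
palm_patents = 12_000      # count cited in the comment

price_per_patent = motorola_price / motorola_patents
palm_value = price_per_patent * palm_patents

print(f"${price_per_patent / 1e3:.0f}K per patent")    # $735K per patent
print(f"${palm_value / 1e9:.1f}B for Palm's patents")  # $8.8B for Palm's patents
```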

andrewljohnson 5 days ago 1 reply      
Glad we told them we weren't interested in developing apps when they contacted us on April 15. I eventually had a phone call with the fellow who emailed us, and I explained to him HP was doomed. He told me they were sure they could be a strong third!

Here was the original email:


I am part of Hewlett-Packard's Business Development team that focuses on engaging one-on-one in strategic relationships with leading partners for webOS app opportunities.

I wanted to touch base with you to see how things were going with your "APP NAME REDACTED", and also discuss with you our plans for HP's upcoming webOS tablet launch.

If you have some time, it would be great to connect to share where we are headed, and also hear more about your mobile strategy.

Let me know when you'd be available and I can schedule a call- I hope to speak with you soon.

bradly 5 days ago 2 replies      
HP sent me a TouchPad a couple of weeks ago, and after the first two days I haven't touched it. It just didn't seem like a finished product. I got errors and buggy UI interactions regularly. It lacked mature apps for basic usage like reading ebooks. Also, the screen-rotation sensor is really sensitive, so it almost always needs to have the orientation locked.

A couple of things I did like about the TouchPad over the iPad: the shift key on the keyboard to access special characters more easily, and downloading apps without being thrown out of the App Store.

RexRollman 5 days ago  replies      
It is amazing to me that of everyone who has made tablets, only Apple seems to be successful at it. Maybe it is just like someone said recently: there is no tablet market; just an iPad market.
spitfire 5 days ago 4 replies      
Wow. That was incompetent. Talk about ignoring your OODA loop. Who is the new guy running HP these days?

Had HP been smart they would have stayed in the tablet business and aggressively integrated it with their software stack. Letting customers buy HP from end to end, sort of like Apple.

They've just handed Apple a much easier path into the enterprise market - which they don't seem to be wasting.

drgath 5 days ago 4 replies      
I'm astonished that they aren't even going to bother selling it off. With the Google/Motorola merger, surely one of the Android partners has to be feeling a little uncomfortable right now. Hell, Motorola was always rumored to be working on its own JS-based OS because it didn't like the position of relying only on Android.
ansy 5 days ago 0 replies      
It appears this kills any chance of another manufacturer picking up webOS. If HP can't make it work why should anyone else try?

Hindsight is 20/20. Google and Palm could have made a deal. At least then the WebOS team could merge with the ChromeOS team. And Google would have that patent trove for a bargain price. And an in-house manufacturer, too, if that's what it wanted.

Too late for that now. At best Google will get the patents at a significant premium. ChromeOS is too far along to benefit from WebOS. And now Google has Motorola, what would it do with Palm's hardware?

recoiledsnake 5 days ago 4 replies      
Are they going to make Windows 8 tablets?

>HP's wording up above leaves things a bit vague, with at least two potential routes left open: licensing webOS to others

Why would anyone license something that's basically stillborn even with a company as big as HP pushing it? Obvious choices for OEMs seem to be Android or Windows 8.

Sign me up for a $150 TouchPad at the fire sale if there is one (resisting the temptation to call it the Ouchpad like one headline did).

That's how much I am willing to spend on the latest OS with no future as it joins the ranks of good-but-dead ones like Amiga and BeOS.

thought_alarm 5 days ago 6 replies      
I wonder what they're going to do with the hundreds of thousands of unsold units. Bury them next to the pile of Atari E.T. cartridges, I suppose.
ConstantineXVI 5 days ago 2 replies      
The TouchPad launched on 7/1 in the US; 48 days ago. By a strange coincidence, that's how long it took for the Kin to be cancelled as well.
p0ppe 5 days ago 1 reply      
Ari Jaaksi, HP's SVP in charge of webOS and services, had an interesting tweet 40 minutes ago:

"We will continue webOS platform full speed! #webOS"

anigbrowl 5 days ago 0 replies      
I didn't expect my prediction to come true quite that fast: RIM and HP are the big losers here; I can't see any reason to buy into the Blackberry OS or WebOS from either a consumer or business point of view.
blinkingled 5 days ago 0 replies      

Height of vague cluelessness in the non-answers. Scary given the impact of the announcements.

"all outcomes are possible, including a potential non-transaction" [about the PC business spin off ] - Why announce anything right now then?

"How will you make webOS profitable? A: We expect the dev expenses to come down dramatically -- down to 1 or 2 cents per share a quarter. If you look at the run rate losses there, you can attribute them pretty much to Palm" -- i.e. we will make it profitable by killing it!

jabo 5 days ago 0 replies      
Only yesterday I received this email from an HP.com email address.


I am in business development with the HP Touchpad group and we are recruiting apps to be ported to our Touchpad Platform. I saw your app on the chrome store and wanted to reach out to gauge the interest. Since the Touchpad supports both Flash and HTML5, our technical team believes that porting your Scribble and Pixza Lite apps to our platform should be fairly easy. We have support resources, MDF, marketing programs, and loaner devices to assist the effort. I would be happy to do a quick call to explain things further.


revorad 5 days ago 2 replies      
"Since I'd been very young HP had been the highlight company for engineers" - http://www.youtube.com/watch?v=UMRmG72LBU8

I wonder what Woz thinks now. It must be a sad day for many HP engineers.

ww520 5 days ago 3 replies      
Is perseverance passé in business these days? Does everything have to be an instant success? What happened to building the product slowly over time?
jemeshsu 5 days ago 3 replies      
HP should open source webOS.
Luyt 4 days ago 0 replies      
esr writes in his latest Smartphone Wars[1] installment:

"WebOS, we hardly knew ye [...] WebOS has looked terminal to us for a long time. [...] WebOS didn't suck, technically speaking. It was certainly better constructed than the turd-with-frosting that is WP7. [...] The cool thing about WebOS was that its architecture was beautiful. [...] WebOS's problem was that the coolness stopped there. The source was closed, with all the usual bad effects including higher defect rates and lower developer interest. [...] Some sort of larger shakeout seems to be going on. [...] RIM is next to the wall, probably. WP7 should already have been terminated for extreme failure (Samsung's own-brand Bada OS is actually outselling it[2]) [...] It might be that Android, Apple, Microsoft, and RIM are now entering scorpions-in-a-bottle time. [...] There can be only one... major incumbent. A whale, with a minnow or two in its shadow. Maybe Android should invert the Twitter fail whale into a success cetacean?"

[1] http://esr.ibiblio.org/?p=3611

[2] http://www.betanews.com/joewilcox/article/Samsungs-Bada-outs...

shaggyfrog 5 days ago 4 replies      
So does Microsoft come a-calling with a bag with a giant $ on it, and suggest coming back to Windows? Does Google dangle Android?

Or does this mean HP is getting out of mobile altogether? If so, that's a huge mistake, given the state of the market right now. Especially if they are spinning off the PC side of things, what's left for HP? Printers?

teyc 5 days ago 0 replies      
HP couldn't solve the chicken vs egg problem. An upstart OS from an established company is still an upstart OS. With MS launching their Windows tablet in the near future, HP will have to run very hard to establish their positions before the MS onslaught.

The journey to iPad started with iPods and iTunes, then iPod touch apps, then iPhone and finally iPad. Steve's brilliant insight is that a music device with a computer is a computer, and outflanked the music industry and Microsoft in one fell swoop.

Even today, MS still can't ship a music store on the PC.

PS What Google did right was to give away the OS early on, which allowed the Shenzhen manufacturers to produce the iPeds that swamped the low-end markets. I don't know if this is going to work out in the long run, but installed base is important if you want a developer ecosystem.

steveb 5 days ago 1 reply      
Apple's path of disruption continues. I hope they open source WebOS, but they will probably sell it for the patents based on what MMI got from Google.

I wish Nokia would have bought Palm.

Expect the PC market to be disrupted next by Apple with Macbook Airs sucking up all the profits, and iOS devices crimping unit growth. Who is going to stick their neck out to make single-digit margins on a Windows 8 tablet?

voyou 5 days ago 0 replies      
They're also selling off their PC division, and buying Autonomy (http://www.autonomy.com/), who produce search and data-mining type software. I assume this is an attempt to re-focus on enterprise back-end stuff; this is a pretty big part of their business already, so I can see how this move might make sense.
nQuo 5 days ago 0 replies      
I feel that one of the main problems with webOS is that many people like it a lot for the design and want to see its potential unleashed.

Yet users don't vote for webOS with their wallets and developers don't make webOS apps, because many question whether it can be a sustainable mobile platform. Another reason is that the hardware running webOS has always been a bit of a let-down.

On a side note, I can't help but wonder if Steve Jobs feels a little saddened to see HP's lack of vision and commitment nowadays to make great products. Yes, HP's obviously a major competitor but mostly in the PC industry where it makes much slimmer margins. Jobs' first summer job was at Hewlett-Packard and also where he met Steve Wozniak, so it's more of a personal sting to witness the once iconic company giving up on post-PC devices.

dman 5 days ago 0 replies      
This is karma coming back to bite Palm for doing nothing with the BeOS rights.
jp 5 days ago 0 replies      
I think the PRE2 is a great device. Google Apps email, SMS texting and Google Maps are fantastic. The browser works great. The clever status light, the window swiping, the bottom gesture panel logic and the clever window bundling make Android and Bada feel completely retarded. Very good over-the-air software updater. Pointing that out since Bada is very dumb in that department and requires a huge iTunes-like application. The HP CEO is either retarded or scared.

Like... I got an HP laptop and an HP phone... and suddenly this is not cool enough for HP because of some premature, rushed-to-market tablets?

Disclosure: I have received a few developer phones over the years.

saturdaysaint 5 days ago 2 replies      
I wonder if WebOS will be attractive to any of the handset makers that harbor suspicions about Google/Motorola - HTC? Samsung (call it Bada 2.0?)? Sony?


It strikes me that Apple could make a lot of money by making cheaper (sub $1,000) MacBooks now that the competition is running scared. If they could sell a 2010 white Macbook for $900 (which I presume would still have a respectable margin), you wouldn't see a PC on a college campus and they'd even make a compelling case for a bigger enterprise push..

taylorbuley 5 days ago 1 reply      
HP just created itself the perfect reason to open source WebOS and throw a disruption cocktail at Google and Apple
emehrkay 5 days ago 2 replies      
I still want someone to couple webOS with great hardware and dev tools.
A-K 5 days ago 2 replies      
Tragic. It makes me wonder: Now what is Jon Rubinstein's future at HP?
mathoda 5 days ago 0 replies      
If you're an HP engineer working on webOS, given the commitment level your execs have to that platform, why wouldn't you jump ship? Startups, Apple, Google, Facebook, even Microsoft, may benefit in terms of human talent.
dgregd 5 days ago 3 replies      
One thing I do not understand: these HP and other CEOs earn millions monthly, so they are not stupid.

How many desktop _platforms_ survived? Two: Windows, and MacOS by a miracle.

So why do they think the world needs more than 2 or 3 mobile _platforms_? Why do they think it is possible to beat Microsoft / Nokia / RIM / Samsung with an already collapsed company? Why did they buy Palm? What was the thinking?

Or maybe they are personally making money from acquisitions like this, similar to the banking CEOs who led the whole financial sector off the cliff.

Apocryphon 5 days ago 1 reply      
And then there were 2 and a half.
soapdog 5 days ago 1 reply      
wtf?!?!?!? HP Pre 3 just launched on Palm Europe??? this makes no sense
azth 5 days ago 1 reply      
From http://www.bbc.co.uk/news/business-14584428

"HP is recognising what the world has recognised, which is hardware in terms of consumers is not a huge growth business anymore," said Michael Yoshikami, chief executive of YCMNET Advisors.

"It's not where the money is. It's in keeping with the new CEO's perspective that they want to be more in services and more business-oriented."

jsz0 5 days ago 1 reply      
Is this a practical example of how Google's strategy with Android is just a bit anti-competitive? Neither Palm or HP could have seriously considered licensing WebOS to compete against Android because neither has a gigantic advertising business to subsidize the cost of development. On the flip side of that how is Palm or HP going to compete with companies like Samsung, HTC, Motorola, etc who are getting their OS for free? If they skip the costs to develop their own OS and go with Android what differentiates an HP tablet from a Samsung tablet if they're running the same OS? All roads lead back to Android. You either use it or you can't compete against it.
watmough 5 days ago 0 replies      
This is really sad. I would have liked to have seen webOS carve out a decent niche against Android and iOS, especially seeing that application development could be largely JavaScript-based with C++ for the high-performance parts.

Commiserations to all the people that worked on webOS.

rbanffy 5 days ago 2 replies      
Apotheker behaved in a truly managerish way...
protomyth 5 days ago 1 reply      
This is one of those times I wish Red Hat (or another Linux company) had the money to buy WebOS from HP. It would make a very interesting standard UI.
barista 5 days ago 0 replies      
Inevitable decision. It's really hard to do the complete stack unless you are completely committed to it like Apple is. It was always going to be an uphill battle for HP, and I am glad they realized this sooner rather than later. I said this before: Nokia did a smart thing by adopting third-party software and leveraging their position in hardware to get a good bargain in return.
nnutter 5 days ago 0 replies      
WebOS was so close to being amazing. I can only hope they pull off something amazing with HTC or Samsung making the hardware.

On the other hand if Apple makes a 7" tablet I won't even have a major reason to look elsewhere.

checoivan 5 days ago 2 replies      
I don't think HP is throwing in the towel, just not putting all its eggs in a single basket. They now know how big a challenge it is to turn things around in the tablet market. Cloud is the thing now, so they're also getting into it.
hollerith 5 days ago 1 reply      
Does anybody here think that Microsoft will publicly concede defeat in the smartphone business like HP did today at any time in the next 4 years?
westajay 4 days ago 0 replies      
This reminds me of Gerstner's OS/2 move with IBM in the early 90's.

Does anyone have insight into why Palm (pre & post HP) struggled to get the hardware right and remove glitchy behaviour from the OS?

jamieb 4 days ago 0 replies      
Some companies don't seem to have caught on that we are in a recession. People don't have the money to just go out and buy the latest cool gadget. No more one-upping friends. No more trying out the latest toy. In a bygone age HP Touchwhatsits would be littering Hummer H2 dashboards everywhere. Not so today.
Please confirm your email address bvckup.tumblr.com
417 points by huhtenberg  5 days ago   79 comments top 14
MatthewPhillips 5 days ago 3 replies      
These are the types of stories I like to see on the front page of HN.
starpilot 4 days ago 0 replies      
Using the imperative wording "please confirm" suggests the recipient must take an action, rather than passively receive information. This reminds me of a Mac OS X Human Interface Guideline:

"Use a verb or verb phrase for the title of a push button. The title you create should describe the action the button performs: Save, Close, Print, Delete, Change Password, and so on."


The difference between "Email confirmation" and "Please confirm your email" is like that between "Ok" and "Save."

latitude 5 days ago 9 replies      
That's my post there. It'd be very interesting to compare email confirmation rates. I am running a private mail server configured according to all possible spam-filtering guidelines, and yet I am still at only 82%. I wonder if that 18% is just people overwhelmed with email in their inbox, or if there are still some issues with being mis-categorized as spam.

Anyone else care to share their numbers?

rickmb 4 days ago 1 reply      
I remember a talk by Terry Chay in which he showed that adding a smiley to the subject (don't remember the project) dramatically increased the response.

Little details can make such a big difference.

acangiano 5 days ago 2 replies      
For AnyNewBooks.com I have the following thank you page: http://anynewbooks.com/thank-you/

And the email I send has the subject: "Please confirm your subscription"

Between the two, the message is fairly clear. Yet, I still receive emails from novice users once in a while, who aren't familiar with the concept of "confirming" a subscription. Usability isn't easy.

sudonim 4 days ago 0 replies      
We use the Devise gem https://github.com/plataformatec/devise and the default subject line is "Confirmation instructions". I can't say anyone on our team thought to change it. Good clear language like "Please confirm your email address" would be great to have as the default in gems like Devise.
EGreg 4 days ago 0 replies      
We had a cool thing. We let the user use our site once, but to "activate their account" and set up a password, they had to click the link in their email. Otherwise they would be reminded next time they tried to sign in (without a password)

It gave people an incentive to verify their email!

Check out blurts.com and qbix.com for examples

duck 4 days ago 0 replies      
I think it totally depends on your subscriber base, but with Hacker Newsletter I've seen about a 95% success rate. I have always used the default MailChimp subject which is "Hacker Newsletter List: Please Confirm Subscription".
resdirector 4 days ago 0 replies      
I use usertesting.com. You get a ten minute video and audio of testers using your site. These types of small (aka big) problems are weeded out pretty quickly.

Can't rate them high enough.

joshu 4 days ago 1 reply      
This is good. I wish more people would talk about email praxis because it seems like voodoo a lot of the time.

Do we even really need to confirm email addresses?

pkamb 4 days ago 0 replies      
This always bothered me about SEOmoz's new public analytics button on all their blog updates.

The button says "Post Analytics". That makes me think "send analytics to SEOmoz". But what they really mean is "view this blog post's analytics".

cpeterso 5 days ago 3 replies      
You might also consider adding a web bug image to your confirmation email. If the user loads the confirmation email's HTML, you know it is a real email address even if they don't click the confirmation link.

Admittedly, it might be someone ELSE'S email address, but they could manually unsubscribe from your mailing list later.
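A web bug is just a uniquely tagged 1x1 image embedded in the email's HTML. A minimal sketch of the idea (Python; the URL, function name, and token scheme are hypothetical illustrations, not any particular mailer's API):

```python
import uuid

# A 1x1 transparent GIF (43 bytes) to serve when the tagged image
# URL is requested by the recipient's mail client.
PIXEL_GIF = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
    b"\x21\xf9\x04\x01\x00\x00\x00\x00"
    b"\x2c\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02\x44\x01\x00\x3b"
)

def tracking_pixel_tag(base_url: str, recipient: str) -> str:
    """Build the <img> tag to embed in the confirmation email's HTML.

    When the mail client fetches the image, the token in the URL tells
    you which address actually rendered the email, even if the
    confirmation link was never clicked.
    """
    # Deterministic per-recipient token (hypothetical scheme).
    token = uuid.uuid5(uuid.NAMESPACE_URL, recipient).hex
    return f'<img src="{base_url}/open.gif?t={token}" width="1" height="1" alt="">'

tag = tracking_pixel_tag("https://example.com", "user@example.com")
print(tag)
```

As the comment notes, an opened email only proves the address is live, not that its owner signed up, so it complements rather than replaces the confirmation link.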

trustfundbaby 4 days ago 0 replies      
I bet the one with the title "Email confirmation" was going to spam folders ... I've seen a lot of spam with that exact title when I'm cleaning out my gmail spam folder
Tilleul 3 days ago 0 replies      
78 comments so far, and not one mention of technical writing. I have to admit that I almost never hear about hiring or contracting with tech writers at early-stage startups. But it might be a worthwhile idea.

I'm a tech writer, and I hope that's not too much self-advertising for my first post here.

Your problem with Vim is that you don't grok vi stackoverflow.com
388 points by ez77  1 day ago   179 comments top 20
dionidium 1 day ago 3 replies      
One thing I think a lot of the naysayers here are missing is that you don't sit down and read a book on vim -- or even this stackoverflow answer -- and immediately expect to begin using all you've learned. It takes time. And it's painful at first. But it's a lot like gripping a tennis racket correctly or properly positioning your fingers at a piano. Your default method is indubitably the most comfortable -- by definition -- but the payoff of attempting the more difficult path can be likened to the magic of compound interest.
eftpotrm 1 day ago  replies      
Flame suit on, but I really don't understand the appeal.

I have reams of arcane bits of information to commit to memory already. .Net in two languages plus VB6 for legacy code. SQL in more dialects than I care to list, each with their own quirks and limitations. HTML and JavaScript. XML and various associated technologies like XPath. That's just professionally; once I go home I can add in photography and music straight off with others at different levels.

There's only so many things one can usefully commit to memory (edit - yes, I meant am prepared to dedicate the time to committing to memory); odd bits are just easier to look up. So why, exactly, should I voluntarily use a text editor whose interface was already far behind the state of the art when I started school?

:q!. I've had others try to persuade me of the joys and power of vi; I'm not even slightly convinced and that is the one command I'm committing to memory.

JulianMorrison 1 day ago 6 replies      
My problem with Vim is that it can't be configured to validate and correctly complete code in realtime (except by mad-science brain grafts like eclim). Note that ctags style word completion is like 1% of enough. The important thing is that code is either visibly wrong (red squigglies) or visibly syntactically valid (no red squigglies). That massively cuts down the contribution of trivial typos to bug hunting. And to do it requires a full parse and analysis of the language being edited, going on in the background as you type.

Without that, Vim is reduced to a quick hacks editor, or an editor for languages where no realtime parse is possible.

Triumvark 1 day ago 3 replies      
> I've often done things iteratively and interactively that could probably have been done more efficiently if I'd taken the time to think out the correct incantation.

Maybe not. Efficiency is a measure of the time it takes to accomplish something. 'Puzzling out how' can sometimes be a more relevant bottleneck than 'number of keystrokes.'

freddealmeida 1 day ago 3 replies      
Interestingly, Emacs seems to be popular in Japan (where I'm based), but I always preferred Vim, though it is rather complex.

If you want to do any Japanese text editing, most IDEs or editors fail horribly. I almost always find myself falling back to Vim.

Mind you, I really like TextMate. Sadly, it sucks at Japanese, and will for the time being.

hackermom 1 day ago 1 reply      
I think most people's problem with vim and vi is that they just don't feel like crossing the river to get their water.
scarmig 1 day ago 6 replies      
I wonder how much kvetching could have been avoided simply by making Vim open up in insert mode.
swah 1 day ago 1 reply      
I wonder if the set of folks discussing here is completely different from the set of folks that saw Notch (Minecraft creator) coding on the weekend. Everyone seemed to agree he was very productive developing his game inside Eclipse.
jodoherty 1 day ago 3 replies      
This is one of the reasons I wish all standard text controls came with an optional VI mode.
CmdrKrool 1 day ago 3 replies      
A thing that nobody ever seems to mention which unsettles me far more than learning key commands or modal behaviour in my brief tryouts with Vim is how its cursor visually highlights a current character rather than a point between characters like every other editor I've ever used.

Say I have a line as follows:

  abcdef

In other editors:

  to delete "abc" from the left:
put the cursor on the a
C-d C-d C-d

to delete "abc" from the right:
put the cursor on the character after the c (the d)
C-h C-h C-h

to delete "def" from the left:
put the cursor on the d
C-d C-d C-d

to delete "def" from the right:
put the cursor on the character after the f (the EOL)
C-h C-h C-h

Symmetrical and consistent, to my mind.

In Vim:

  to delete "abc" from the left:
put the cursor before the a

to delete "abc" from the right:
put the cursor on the character after the c (the d)

to delete "def" from the left:
put the cursor on the d

to delete "def" from the right:
put the cursor on the f

The last case there is exceptional. It's a small thing, but not being able to put the cursor on a real or virtual 'end-of-line' character in Vim makes me feel constricted.

I think that the between chars cursor model is simpler because you then have two choices: act on the preceding chars (thus, delete with C-h or <Backspace>) or act on the succeeding chars (thus, delete with C-d or <Delete>). By contrast Vim is more complicated because from the current character you have three choices: act on the preceding chars (thus, delete with X), act on the current char (thus, delete with x), and the principle of 'completeness' suggests a third: act on the succeeding chars - which AFAIK is not available in Vim and so it feels lop-sided to me.

Ironically I think that it's my 'programmer head' which makes the between chars model appeal to me more, as I think of the file as a bunch of bytes (well, maybe multibyte characters) and the text editor as a glorified hex editor, and I just want to choose a position and insert or delete chars regardless of whether they're alphanumeric, LFs, or whatever. Whereas Vim's model of manipulating words, lines and sentences suggests to me a fit with people writing in human languages. Why, non-geeks should love Vim, perhaps except for all the key commands to learn; one imagines an alternate universe where the common keyboard evolved with two Return keys instead of one, labelled "Open new line above" and "Open new line below", and other Vim-inspired niceties - and I wonder whether your archetypal 'grandmother' might find it easier to write letters on the computer in that world than her current futzing about in Microsoft Word.

binaryjohn 1 day ago 1 reply      
It's great to see all the vimmers on this thread. One question: WHERE are all you guys? I work at a very large corp. and I am hard pressed to find another vimmer when it comes to discussing the beauty/elegance/purity of vi at the water cooler. Maybe we need a secret handshake or nod?
s00pcan 1 day ago 1 reply      
Is it a good or bad sign that I'm starting to see a lot of stories repeat on here? I installed Ubuntu on my work machine last Friday, and I was just looking at this earlier this morning (I had saved it the last time it was posted).
methodin 1 day ago 0 replies      
It took me far too long to really start using vim. I have to say, though, that as an avid user now I find it really annoying to catch myself typing dw, yy, p, :q and :w in the other editors I have to use on other machines. It does take discipline, but it is really FUN. How awesome is it to find some really sweet new key combination you didn't know existed? I almost never seek out the obscure settings of newer IDEs, yet I often randomly search out new key combinations for vim. Love it.
scelerat 1 day ago 0 replies      
I have been enjoying using Vico for the past month or so, a pretty Mac face on vi, with extras like support for Textmate Bundles. The mac-ness of the interface helped me through some vi adjustment rough spots, and now I'm extremely comfortable doing almost everything the vi way.
JoshTriplett 1 day ago 0 replies      
This article captures the one thing about vim that took me the longest to "get", and which proved the most powerful once I did: vim has a rich set of movement commands, and a pile of commands which operate on those movement commands. Putting the two together gives a huge set of useful commands with a simple two-part form.
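That operator-times-motion composition can be modeled as a simple cross product. A toy sketch (illustrative only; this models the command grammar, not how Vim is implemented):

```python
# A toy model of Vim's command grammar: every (operator, motion)
# pair forms a valid editing command, so N operators and M motions
# yield N*M commands for the price of N+M primitives.
operators = {"d": "delete", "c": "change", "y": "yank"}
motions = {"w": "to next word", "$": "to end of line", "}": "to next paragraph"}

commands = {
    op + mo: f"{op_desc} {mo_desc}"
    for op, op_desc in operators.items()
    for mo, mo_desc in motions.items()
}

print(len(commands))   # 9
print(commands["dw"])  # delete to next word
```

Learning one new motion immediately upgrades every operator you already know, which is why the vocabulary feels multiplicative rather than additive.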
snorkel 1 day ago 3 replies      
Yes, I get it: some hackers of lore kept adding magic commands to the unix line editor until it grew into a stand-alone editor. Still doesn't appeal to me. Too many arcane keystrokes to do simple edits.
dramaticus3 1 day ago 2 replies      
> A sampling of more advanced tricks

I don't do tricks.

sabat 1 day ago 0 replies      
Actually it's just that I'm not a fan of editor modes.
Tloewald 1 day ago 6 replies      
Good post, and worth reading if you're stuck with vim and don't like it.

But, reading this reminds me of how much I like GUIs. Yup these movements are great. I get the appeal of turning selecting text into a dumb programming trick, I'm sure it makes you feel very clever and productive. Of course I can do all this stuff and more without even consciously thinking about it using a mouse.

Go do a multi-file grep in BBEdit in front of an emacs jock one day. Look what I can do by finding a menu item!

Douglas Adams used to talk about how powerful formatting features in word processors were a great way to procrastinate.

funkah 1 day ago 2 replies      
Why should I have to grok a text editor? Up, down, left and right should get me around and if they don't, I'm not the one who's wrong.

"You guys just don't understand why I eat dirt. I'm on a higher plane of existence and you just don't get it."

The Secret of the Fibonacci Sequence in Trees amnh.org
337 points by pigbucket  4 days ago   26 comments top 12
tripzilch 4 days ago 2 replies      
> Scientists and naturalists have discovered the Fibonacci sequence appearing in many forms in nature, such as the shape of nautilus shells, the seeds of sunflowers, falcon flight patterns and galaxies flying through space. What's more mysterious is that the "divine" number equals your height divided by the height of your torso, and even weirder, the ratio of female bees to male bees in a typical hive! (Livio)

Except that most of this is simply not true:

It's a very tasty popular myth that people like to repeat, that there's a magical sacred golden constant producing all the complexity in nature and more.

Except that nobody actually bothers to measure anything, they just keep repeating and reposting the same images of spiral galaxies and nautilus shells.

Nor is there anything "inherently beautiful" about the golden ratio, research into perceived aesthetics of ratios simply showed that people prefer fractions of small numbers. It's imprecise enough that you really can't say whether people like 1.5 (3/2) or 1.667 (5/3) or 1.618 (phi) best.

The one thing where he is right is the pattern in sunflower seeds. If you divide the 360 degrees of a circle into two parts so that their ratio is 1:1.618, and you use that angle (about 137.5 degrees) to rotate outwards in a spiral, putting a big dot at every point, you'll get a pattern that looks pretty much exactly like sunflower seeds.

The thing about this particular pattern is that the seeds end up being rather uniformly spaced over the plane, while using other angular ratios creates swirly patterns and waves of filled and empty regions.
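That construction (often called Vogel's model) is easy to sketch; below is an illustrative Python version, where the radius r = sqrt(k) is chosen only to keep the seed density roughly constant, and the "min gap" comparison shows why the golden angle spaces points more uniformly than a rational angle:

```python
import math

def spiral(n, turn_deg):
    """Place n points by rotating turn_deg each step, moving outward as sqrt(k)."""
    pts = []
    for k in range(1, n + 1):
        theta = math.radians(k * turn_deg)
        r = math.sqrt(k)  # sqrt keeps the density of points roughly uniform
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts

def min_gap(pts):
    """Smallest distance between any two points (O(n^2), fine for a sketch)."""
    return min(math.dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:])

golden_angle = 360 * (1 - 1 / 1.61803398875)  # about 137.5 degrees

# The golden angle spreads seeds evenly; a "rational" angle piles them into spokes.
print(min_gap(spiral(300, golden_angle)))  # stays comfortably large
print(min_gap(spiral(300, 180.0)))         # shrinks as points stack onto two spokes
```

Plot the two point sets and the first looks like a sunflower head, while the second collapses into spokes with big empty regions, which is exactly the "swirly patterns and waves of filled and empty regions" described above.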

So I can imagine if you apply this to the rotation of tree branches, it'll result in a more uniformly distributed pattern, that will capture sunlight more efficiently than a pattern with holes in it.

I kind of wonder, though, if it's not the other way around: because nature uses golden ratio angles in tree branches, the Fibonacci numbers pop up. It's really easy for Fibonacci numbers to pop up anywhere, especially the small ones; what's significant is when the golden ratio actually plays a meaningful role.

extension 4 days ago 1 reply      
I'm not sure how much of this the kid actually discovered on his own. The Wikipedia page on Phyllotaxis cites plenty of past research on why the Fibonacci sequence shows up (and the kid oddly hand copied the illustrations from that page).

It's an emergent pattern from the branches shoving each other around as they grow. It minimizes the overlap of the leaves if they are being added indefinitely. If you know in advance how many leaves/panels there will be then obviously you can just space them evenly. If you ran that experiment with one tree of evenly spaced/angled panels and one tree of golden angle spaced panels, I think the evenly spaced one would win.

palish 4 days ago 2 replies      
I wish Aidan had been allowed to write this in his own words, rather than his parent's / someone else's words.

On the other hand, whoever's taking care of him behind the scenes has done an incredible job. I'd even say Aidan's "set for life"; that might seem over the top, but consider... this link will forever be associated with his name. It demonstrates that even at age 13, he was a very capable real-world problem solver, while also showing off his ability to perform and present his own original research in ways that other people can build on.

That's going to impress virtually everyone he ever meets, probably. Admissions boards, employers, investors, etc. Obviously that assumes he plays his cards correctly going forward. Still, though... this will always be a de-facto "get-his-foot-in-the-door" for him, regardless of whatever it is he's trying to do. Except maybe to pick up chicks.

I just hope he doesn't become a victim of his own success. Hearing "you're such a genius!" from everyone around him would not be good for his future self.

ColinWright 4 days ago 0 replies      
This is by far and away the better article. Such a shame the discussion is on the totally crap repackaging of it:


SimHacker 4 days ago 0 replies      
This is a question that fascinated Alan Turing, who wrote a classic paper called "The Chemical Basis of Morphogenesis" and other unfinished papers about the subject (some published in a book called "Morphogenesis"). He used lots of heavy math, which came naturally to him, to model plant growth as a reaction-diffusion system running in a ring of cells (the stem of the plant). By computing the reactions by hand on paper, he studied how cells could grow into "parastichy", with spiral patterns related by Fibonacci numbers. http://botanydictionary.org/parastichy.html
hackermom 4 days ago 1 reply      
Here's a little something that most people don't know, that I picked up from my architecturally interested father long ago:

The French architect Le Corbusier (http://en.wikipedia.org/wiki/Le_Corbusier) made use of Fibonacci sequences to create his famous "Modulor" (http://www.apprendre-en-ligne.net/blog/images/architecture/m... - "A harmonic measure to the human scale, universally applicable to architecture and mechanics.") which represents a few fixed points in Fibonacci sequences that have been in use in architecture, interior decoration, carpentry etc. for more than 50 years, at least here in Europe - I have no idea if these scales are as rigorously followed in the Americas or in Asia.

If you look at the picture, and then look at the height of the seat of your kitchen chairs, your kitchen table, your kitchen sink, your cupboards etc., you will find that their tops, bottoms and heights almost always align around numbers in these scales. These measurements create a strange sense of harmony in the way the mind processes geometry picked up from eyesight, which is not perceivable as soon as you move away from these dimensions, in some way quite similar to how the Golden Ratio pleases the eye.

Just for fun I measured some of the interior in my home. Desk: 69cm. Kitchen chairs and kitchen table: 43cm and 70cm. Kitchen sink: 88cm. Bottom and top of wall-mounted kitchen cupboards: 138cm, 225cm (height of 87cm).

Also interesting to note is that similar scales have been found to be used in ancient times as well - seems we took notice of this particular natural pattern long ago.

nvictor 4 days ago 0 replies      
wow people! slightly off topic but that's how you DELIVER information. no ads bullshit, straight, references...

now compare that to the first link we got.

lukesandberg 4 days ago 1 reply      
In Wolfram's "A New Kind of Science" there is a long discussion not only of the fact that leaf arrangement tends to use the golden ratio and that it is optimal for the plant, but also of a model (using cellular automata) that explains why such a pattern might emerge naturally. Unfortunately I don't remember all the details, but it was a compelling case for why automata might be a good model for natural phenomena.
thebootstrapper 4 days ago 1 reply      
Brilliant. Perhaps the first time I'm seeing someone use Fibonacci for something other than learning a programming language ;-)
Daniel_Newby 4 days ago 0 replies      
Trees also optimize for shading their competitors and avoiding being shaded, not just for efficiently gathering raw light. Understanding the shading factor would require extensive field work and Monte Carlo analysis.
whileonebegin 4 days ago 0 replies      
This reminded me of the PBS NOVA episode about how the Mandelbrot set can describe nature, like the spacing of trees in a forest, the spacing of branches on a tree, the spacing of leaves on a branch, the spacing of veins in the leaf, etc. It's not just random.

Apparently, the Fibonacci sequence can be found within the Mandelbrot set, which makes sense given the author's discovery.


ck2 4 days ago 1 reply      
Just say fractals. We already knew they appear everywhere in nature.
If PHP Were British addedbytes.com
330 points by shdon  3 days ago   102 comments top 19
wbhart 3 days ago 2 replies      
If it were truly British then upon an exception it would apologise excessively and offer to dust you off.

Queues would also be ubiquitous. Data could not be accessed in order, but would have to be retrieved from an unordered queue, which would necessarily involve a long wait.

There would also be a congestion charge for drivers at certain hours of the day and interfacing with Rails would cause unexpected delays.

halostatue 3 days ago 2 replies      
Funny, but completely ignorant of the fact that Canadian English is distinct from American English. We generally spell it 'colour' here, but draw the line at 'connexion', and have two spellings for 'seriali[sz]e', depending on the audience and style guide you follow. Joe Clark has a wonderful book about this, Organizing Our Marvellous Neighbours (http://en-ca.org/).
corin_ 3 days ago 6 replies      
I find myself getting genuinely annoyed when having to write things like "color" in CSS.

I wonder, though, as fun as these jokes are to think about, what serious implications it would have had. Let's say, for example, PHP had actually been started with variables using a £ rather than $: would it have made any difference at all?

BerislavLopac 3 days ago 0 replies      
I'm amazed that no comment has yet mentioned my favo(u)rite take on the subject, the immortal Lingua::Romana::Perligata http://www.csse.monash.edu.au/~damian/papers/HTML/Perligata.... ;)
tarkin2 3 days ago 2 replies      
$ is superior though, simply for its ease of access using the left index finger.

£ would require the use of the middle finger.

To British sensibilities, it would mean swearing at PHP on every variable declaration! Actually, that said...

j_col 3 days ago 1 reply      
Brilliant! Given my sadness today around HP destroying the mobile platform I love (webOS), this has really cheered me up ;-) I especially like the cheerio() function.
uriel 3 days ago 0 replies      
And if PHP were Japanese it would commit seppuku.
lubutu 3 days ago 8 replies      
The only language I know of which actually uses "colour" in its standard library is Occam, designed in Bristol. Are there any others?

(Nitpick: "socialize" is the original spelling; "socialise" was a change in spelling on our side of the pond.)

premchai21 3 days ago 5 replies      
preg_match might actually be better expanded as practical_extraction_and_reporting_language_regular_expression_match. (I'd comment on the original but it's currently too overloaded for it, I gather.)
fdb 2 days ago 0 replies      
He does have a point with the abbreviations.

The use of abbreviations and underscores in PHP is maddeningly inconsistent.



georgieporgie 3 days ago 0 replies      
I always thought the dollar sign originated as a variable marker because it was used in BASIC to mark a string. Is there an earlier origin?
reinhardt 3 days ago 0 replies      
How long until some bored hacker writes a british<->us PHP compiler and posts it here? I give it a week.
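For what it's worth, the word-substitution half of such a tool is nearly trivial; here is a hypothetical Python sketch (the mapping table and function name are made up for illustration, and a real version would need a much fuller table plus logic to skip string literals and comments):

```python
import re

# Hypothetical British -> American mapping; a real tool would need a fuller table.
BRITISH_TO_US = {
    "serialise": "serialize",
    "initialise": "initialize",
    "colour": "color",
    "behaviour": "behavior",
}

def americanise(source):
    """Replace whole-word British spellings with their American equivalents."""
    pattern = re.compile(r"\b(" + "|".join(BRITISH_TO_US) + r")\b")
    return pattern.sub(lambda m: BRITISH_TO_US[m.group(1)], source)

print(americanise("serialise($colour);"))  # serialize($color);
```

The hard part of the joke compiler would be the other direction: deciding when `perform_the_aforementioned_diversion()` is really just `break`.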
thelovelyfish 3 days ago 1 reply      
If it were British there would be alcohol involved somehow.
cafard 3 days ago 0 replies      
If we change the sigils, can we refer to objects as "quiddities"?
jordinl 3 days ago 0 replies      
A colleague once complained because I wrote 'initialize' with a z...
zandorg 3 days ago 1 reply      
In Common Lisp, 'Fourth' is the fourth item in a list, rather than 'Forth'.
4J7z0Fgt63dTZbs 2 days ago 0 replies      
Now if PHP were Japanese?
PedroCandeias 3 days ago 0 replies      
Joke about php devs not being savvy enough to actually go and make these changes in 3... 2... 1...
"Any time you have worked long hours, it is a sign of a broken process." stackexchange.com
316 points by zzzeek  5 days ago   162 comments top 39
edw519 5 days ago  replies      
What do long hours often represent?

Enterprise: Incompetent management, lazy co-workers, and spoiled users.

Small Business: Tough competition and limited resources.

Startup: Taking advantage of opportunities that may not pass this way again.

akeefer 5 days ago 0 replies      
One thing my company has done since way back in its startup days (we're pretty successful at this point) is put an emphasis on working reasonable hours, and it's been successful for us. Some people still choose to work long hours because they're excited about what they're working on, but it's not expected or asked of anyone. There were two reasons for doing that even as a startup, and I think they're both still valid.

Reason #1 is that working long hours often becomes an excuse to not prioritize properly. Working under realistic constraints forces you to really decide whether some feature is worth it, or if spending 40 hours on Feature A is better than spending 40 hours on Features B, C, and D combined. Too often the answer at companies is, "Well, A, B, C, and D are all really important, so just work harder and do everything." That's a very seductive trap to fall into, but it's absolutely the wrong escape valve. At least in my view, a failure to focus and prioritize properly is far more often a cause of failure for startups than "we didn't work hard enough."

Reason #2 is that you want to avoid burning people out if you expect to be around for the long haul. Our company just turned 10, and we still have a surprising number of long-tenured engineers, which I'd attribute in large part to the work environment and the relative sanity of the work/life balance people can have. If you expect people to work 60+ hours a week every week, they're not going to stick around for 10 years; they're going to get burned out and bored and they'll feel like the only way to get a break is to quit.

You can quibble with the second reason, but I think that even in a situation where you feel like you have to get a ton done and working 40 hours a week isn't an option, it's very important not to use "we'll just work harder" as an excuse to avoid making the hard decisions around priorities.

felipemnoa 5 days ago 4 replies      
What a bunch of B.S. In fact I cannot believe it is even here in Hacker News.

"Any time you have worked long hours, it is a sign of a broken process."

This is such a horrible generalization. Successful people always work hard/long hours to make something succeed. Imagine telling your kids that to be successful they should work just 40 hours and no more. They will be easily steamrolled by other kids who are willing to work harder and go the extra mile.

Here is a relevant piece from: http://www.paulgraham.com/hamming.html

>>Now for the matter of drive. You observe that most great scientists have tremendous drive. I worked for ten years with John Tukey at Bell Labs. He had tremendous drive. One day about three or four years after I joined, I discovered that John Tukey was slightly younger than I was. John was a genius and I clearly was not. Well I went storming into Bode's office and said, ``How can anybody my age know as much as John Tukey does?'' He leaned back in his chair, put his hands behind his head, grinned slightly, and said, ``You would be surprised Hamming, how much you would know if you worked as hard as he did that many years.'' I simply slunk out of the office!<<

>>What Bode was saying was this: ``Knowledge and productivity are like compound interest.'' Given two people of approximately the same ability and one person who works ten percent more than the other, the latter will more than twice outproduce the former. The more you know, the more you learn; the more you learn, the more you can do; the more you can do, the more the opportunity - it is very much like compound interest. I don't want to give you a rate, but it is a very high rate. Given two people with exactly the same ability, the one person who manages day in and day out to get in one more hour of thinking will be tremendously more productive over a lifetime. I took Bode's remark to heart; I spent a good deal more of my time for some years trying to work a bit harder and I found, in fact, I could get more work done. I don't like to say it in front of my wife, but I did sort of neglect her sometimes; I needed to study. You have to neglect things if you intend to get what you want done. There's no question about this.<<
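Hamming's compound-interest analogy checks out with simple arithmetic: if a 10% effort edge compounds yearly, the output gap really does pass 2x within a decade. A quick Python sketch (the 10%-per-year rate is just the figure assumed in the quote, for illustration):

```python
# If a 10% edge in effort compounds yearly, the gap grows geometrically.
def output_ratio(edge, years):
    """Relative lifetime output of someone compounding an `edge` advantage."""
    return (1 + edge) ** years

for years in (5, 10, 20):
    print(years, round(output_ratio(0.10, years), 2))
# After about 8 years the harder worker has "more than twice outproduced"
# the other: 1.1 ** 8 is roughly 2.14.
```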

Yes, sometimes it may mean that working long hours there is something wrong and the title should reflect that rather than just generalizing.

I remember there was a study done at one point showing that the best piano players had practiced longer hours per week than the so-so piano players.

You want to work 40 hours and be happy? Good! But I doubt you will be able to achieve greatness like that. Achieving success requires sacrifices.

Edison is another example of a guy that would work really long hours. Look at everything that he accomplished. You want to be mediocre, work 40 hours. You want to be great like Edison, work your ass off. Don't listen to the little people that tell you not to work your ass off. That is the road to mediocrity.

Now, if you are saying that you want to have time for family and be another cog in the machine, 40 hours are great for you.

edit - OK, after further reflection I think that what the title means is that IF you are just a cog in the machine of a large corporation AND you are working long hours, then something is terribly wrong. If that was the original intent then I completely agree. It's OK to do it once in a while, but if it is the norm then something is terribly wrong.

Now, for academics, athletics, other competitive fields and even startups, at least in their earlier phases, you still have to work long hours or the other guys will steamroll you. Eventually, though, you do hit a point of diminishing returns, so you have to watch for that.

mburney 5 days ago 5 replies      
I'm curious how efficient most coders are even in an 8 hour work day. I find that I can only log about 4 - 5 hours a day (on average) of solid coding time (or marketing/business work). This is because I limit myself to an 8 hour work day, but of course there are breaks and inevitable down time.
Androsynth 5 days ago 1 reply      
My biggest problem with long hours (aside from the long hours themselves) is that, like most programmers, I'm not constantly productive for 10 hours a day. It frustrates me that I have to sit at my desk for long periods where I'm unproductive before my muse hits and I enter hugely productive periods (which often take place outside of normal business hours).

I don't think some M-F, 9-5 union-type situation is the proper answer, but there's gotta be a better way. Sitting at my desk when I'm not being productive is a waste of my life.

rdouble 5 days ago 2 replies      
Long hours are not always because of a broken process, or death march deadlines.

Many times when I have worked long hours, it was because I was really into the problem I was trying to solve, and didn't want to quit.

jgilliam 5 days ago 2 replies      
Ah, so this is why it takes LinkedIn forever to add new features.
gte910h 5 days ago 1 reply      
Routine long hours are sign of a broken process.

Rare spates of long hours due to abnormal events are not an issue.

sliverstorm 5 days ago 1 reply      
How about in a cyclical project start -> project release process?

I am not referring to "crunch time" where you realize everything is broken and your schedule was unrealistic; rather, many projects I have participated in have an escalating work load as you near release, because a lot of the work simply cannot be done before previous stages are completed.

Unless you are a large entity that can heavily "pipeline" by running 10 or 20 projects at once and shuffle people around as a project's workload changes, then to avoid ever working overtime you'd either have to hire too many employees or hire and fire regularly. Not entirely dissimilar to trying to balance a server cluster with load spikes.

dave_sullivan 5 days ago 5 replies      
I know a lot of people who seem to like working long hours. It allows them to think they're getting more done, but I suspect that many of the processes involved could be optimized if they thought about it a bit. For me, I work long hours but it's spread out over the course of a day (so removing breaks/bs it's probably not much more than 40 hours p/wk), and I love what I do, so I do get a lot done (and it feels less like work than other jobs I've had).

For other jobs, like being a big firm lawyer, long hours are kind of baked in: You get paid a salary (a big one), the company you work for bills you out by the hour, person with the highest billables doesn't get fired. That's probably not likely to change, it's not a process problem per se, and I suspect there's plenty other jobs generally like that.

AlexC04 5 days ago 0 replies      
Honestly I think the real answer is "if you're asking that question, it's time to find a new job"

I've been through this very same wringer recently and strongly believe that changing things from within is far more trouble, far more stressful and far more difficult than the effort is worth.

If you've got so many options, leave.

I'm very happy in my new role, and it took getting out of the old one to fix it.

trustfundbaby 5 days ago 0 replies      
Any time you have worked long hours it is a sign of a broken process.


I see the point they're trying to make, but this is the problem with speaking in absolutes ...

My personal preference (and I suspect other developers do this too, but I could be very wrong about that) is to work when I'm in the zone ... sometimes I can go for 8 hours, others I can go 24 hours straight without any trouble (other times I don't get anything done for a couple of days) ... during projects when I'm knee deep in building something, its not uncommon for me to do 10 - 14 hour days ... not because its expected of me, but because that's how I work.

As long your employer isn't forcing you to do death marches/ insisting you work on weekends and you're getting good rest, exercise and eating well, I don't see a problem.

rokhayakebe 5 days ago 1 reply      
To start with, working 40 hours is in itself probably too much. I do not have studies to back this up, but I think after 5 working hours it is best to stop for the day.
terhechte 5 days ago 0 replies      
If I counted all the overtime that I put in for my current company over the years, I think it would be the equivalent of a year of work. And yes, most of it wasn't necessary, but rather bad planning and a fraked-up process.
ChuckMcM 5 days ago 0 replies      
The punch line is that this is how LinkedIn grew, however I know for a fact that the operations guys put in some odd hours :-)

That being said, its symptomatic. I've been places where the hours were modest and lots got done, and places where the hours were insane and nothing got done.

So the title (and the point the OP makes) don't really hold up. Perhaps it would be more accurate to say that if you can't get done what you need to get done during nominal work hours, then one possibility is that your process is broken. Of course that isn't as impactful :-)

gxs 5 days ago 0 replies      
I don't know - and I absolutely loathe statements like these.

Sometimes, I prefer to work 18 hours in one day and enjoy 2 days free, rather than 9 to 5 it for 3-4 days. At this point, you're insulting my personal preferences, not my process.

Duff 5 days ago 2 replies      
The amusing thing to me is that the question was closed as offtopic. Stackexchange is slowly turning into a Usenet/Wikipedia hybrid.
Hisoka 5 days ago 10 replies      
It's nice that LinkedIn has a great process: regression testing and the like, but what if you work in an environment where you can't afford to test every single little thing, and where business models, let alone requirements change constantly (ie. a startup)? What if a competitor just launched a feature that will put you out of business if you don't implement the same thing in 48 hours? What if there is a mission-critical bug that has to be fixed by the end of the day or else all your customers will bail out?

Secondly, most who work in Wall Street will tell you it's not about the process. It's about the culture. People don't work until 7 or 8 because they're fixing bugs, or because development is so slow. It's because they're expected to and if they get up and leave at 5, it leaves a bad impression on management and their co-workers.

warmfuzzykitten 5 days ago 1 reply      
Process schmoces. Any time I work long hours it's a sign I want to work long hours. Sometimes you're hot and you just don't want to stop.
lwat 5 days ago 1 reply      
One of our most successful clients (grew from nothing to 200+ employees in 5 years) has always had a very strict 6pm closing time. Everyone must be out of the building at 6pm and there's no 'working from home' or 'work on the weekend' allowed.
DavidSJ 5 days ago 0 replies      
Or it's a sign that you love your job.
d0m 5 days ago 0 replies      
"Any time you have worked long hours it is a sign of a broken process." Or extreme pleasure hacking something.
phatbyte 4 days ago 1 reply      
I'm lucky, I can't remember ever doing overnights or weekends, but for the first time we will have to, and I'll get paid 50% of my salary for two extra working weekends.

It really confuses me to see kids with Red Bulls typing code all day and night. I mean, how can you think clearly and be productive that way?

You may do more, but do you do it better and deliver quality code ?

bitops 5 days ago 2 replies      
A lot of nice feel good answers on this post. But for some of us, too freaking bad if you have to work long hours.

I'm not personally a fan of long hours, but not everyone can work for a LinkedIn. And in many shops, long hours are unavoidable regardless of how much well-intentioned process is in place.

biznickman 4 days ago 0 replies      
So says employee #178 .... I'm being sarcastic, but I doubt employee #1 @ LinkedIn would tell you they worked 9-5
thomasgerbe 5 days ago 0 replies      
"Any time you have worked long hours, it is a sign of a broken process."

I hate absolute statements like these. Some of my best works have come from working long hours voluntarily.

jhdavids8 4 days ago 0 replies      
One of the worst articles I've ever seen posted on HN. I'm not going to even read through the comments, but I hope most are simply stating that this is complete BS. Otherwise, you're arguing simply for the sake of arguing. Not everything should be argued, not everything should be over-analyzed. Working long hours is often a choice; you do it to get ahead, you do it to improve your product, you do it for any number of reasons (and yes, maybe you do it because something is broken). Any argument supporting this stupid generalization is simply BS.
FrancescoRizzi 5 days ago 0 replies      
Indeed: "Late nights are a sign of scope failure. Hero mode is a sign of scope failure." (J. Fried of 37signals, from http://37signals.com/svn/posts/2185-a-new-way-of-working-a-t... )
malkia 5 days ago 2 replies      
This broken process ships a lot of games :(
ctdonath 5 days ago 1 reply      
One comment mentioned "Fizz-Buzz" which I hadn't heard of. Interesting tidbit. http://imranontech.com/2007/01/24/using-fizzbuzz-to-find-dev...
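For anyone else who hadn't run into it: Fizz-Buzz is a minimal screening exercise. Print the numbers 1..n, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". A standard Python version:

```python
def fizzbuzz(n):
    """Return the classic Fizz-Buzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

The point of the exercise isn't cleverness; it's that a surprising fraction of applicants for programming jobs can't produce even this.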
KeyBoardG 5 days ago 0 replies      
Just because LinkedIn was a success does not mean a blanket statement can be made for all. I would have titled it "All too often". Too many outside factors can have an impact. I would use Wordpress for the example of working long hours and also being a success.
forgotAgain 5 days ago 0 replies      
Goldratt saw this 30 years ago. No one has done a better job explaining why this is true.


mcculley 5 days ago 0 replies      
Certainly, if you work on an assembly line. If you work in some industry where you have to come up with solutions to problems, the workload may be more lumpy because nobody has figured out how to build a production line for it yet.
jhdavids8 4 days ago 0 replies      
"This one time, at one job I had, I was able to work 9-5 Monday-Friday. The company I worked for was successful. Therefore, ANY job that requires extra hours is the sign of a broken process."

Why would anyone hire a dude like this with such poor reasoning skills?

jowiar 5 days ago 1 reply      
The broken process may not be company-specific. We are in an industry that is driven not just by getting things to market, but by getting things to market faster than the other guy. As we all know, software does not really scale too well to adding people to the problem. Thus, being to market faster is often achieved by coaxing more work out of the same number of people, which results in long hours.

Thought experiment: Imagine some sort of truce declared among startups to skip this part of the arms race. Or, imagine a law passed capping work weeks for software engineers at 50 hours, no exceptions (again, the reason this happening by law would be to eliminate the arms race). What would it do?

patternpaul 5 days ago 0 replies      
I am surprised no one has quoted something from Steve Blank
"Work Smarter Not Harder
As I got older I began to realize that how effective you are is not necessarily correlated with how many hours you work. My ideas about Customer Development started evolving around these concepts. Eric Ries's astute observations about engineering and Lean Startups make the same point. I began to think how to be effective and strategic rather than just present and tactical."
rick888 5 days ago 0 replies      
I've seen this at companies where the boss or manager either doesn't understand the development process or just wants to make money and doesn't care. So you have situations where a feature should take 2 weeks to implement, but they want it in a week (so you need to work extra hours to make up for it).

This is one of the reasons I hate working for other people. If I'm going to be wasting my youth away for something, I'm going to be getting all or the majority of the profits.

hm2k 4 days ago 0 replies      
Maybe you're working long hours to fix the process?

That's the only reason I work long hours.

known 4 days ago 0 replies      
Not applicable if you're debugging code
Marc Andreessen on Why Software is Eating the World wsj.com
308 points by tewks  4 days ago   90 comments top 25
pg 4 days ago 4 replies      
This is actually one of the things we consciously look for: companies that are turning businesses that didn't use to be software businesses into software businesses.
gacba 4 days ago 2 replies      
You can pick at Marc's words as much as you like, but having heard his visions back in '95 when Netscape was big, he's a big-picture guy who sees the forest, not just the trees.

Consider the following:

- In this decade and the last, software engineer consistently ranks in the top ten best jobs

- During the financial crash, software engineers enjoyed the least turmoil and the quickest recovery compared to almost all other sectors

- Software is mission critical to almost every business in the world now, regardless of sector

- Our jobs tend to have the highest pay among the majority of jobs (again, top ten)

I'm with Marc. I'll double down on software right now...it's not going away.

wccrawford 4 days ago 4 replies      
"And, perhaps most telling, you can't have a bubble when people are constantly screaming "Bubble!""

Oh, I bet you can.

ristretto 3 days ago 0 replies      
Add to these the recent announcement of Foxconn to install 1000000 robots. Now software will even be eating up sweatshops. Unfortunately, the rest of the economy is slow to catch up with these changes, in both the developing and (to a smaller degree) the developed world. For the developing world this means a slump in growth until a more educated generation grows up; for the developed world, it means slow job creation. It's not a fault of technology; governments should have seen this coming decades ago. It's a shame that still, in many countries, programming is not required in primary or secondary education.

Take a moment to brag and enjoy the glory. Marc is a hero and this is an inspiring piece. Now back to work...

mathattack 4 days ago 3 replies      
As they say in the financial world, "He's talking his book."

That said, much of what he says is spot on. Software is creeping into everything. Education seems obvious. Health Care will be more difficult. A lot of the change will happen in the US.

If we can't invest money, we still can invest our careers.

quanticle 4 days ago 1 reply      
>And, perhaps most telling, you can't have a bubble when people are constantly screaming "Bubble!"

Not true. People were conscious, as early as 1997, that the dot-com bubble was just that. It didn't stop an unsustainable rise in valuations.

snowwindwaves 4 days ago 1 reply      
"Companies in every industry need to assume that a software revolution is coming."

I can't wait for the revolution to come to the control and automation industry. I can see the heritage and legacy of the (software) tools I have to use, and unfortunately they aren't so old as to have a Unix heritage but to have been born in the Windows 95 era.

Probably I just need to pony up and get the real good high-end shit, but the automation industry is ripe for disruption like health care too. The problem is that the market is small and the stakes are high, so we end up with old, expensive, tried, true, ancient solutions.


DanielBMarkham 3 days ago 2 replies      
And herein lies the problem with patent reform.

The patent system is horribly broken, no doubt. But now that everything -- and I mean everything -- is turning into software, what does that mean for patents?

The capitalist answer is that we should let ideas freely grow and fight each other in the marketplace, but having an idea and selling an idea are two completely different skills. We will reward the salespeople, marketers, and business creators at the expense of the ideas people.

Perhaps that is what we want. Perhaps all ideas, not just startup business ideas, will become worthless. Execution will be the only thing that matters. If so, that's going to have some major impacts in the rest of society. It'll be interesting to watch this play out.

Joeri 3 days ago 0 replies      
It's not just about who will build the software, it's about who will use the software.

The way I see it, we're on the edge of going post-material. The trend is that the proportion of the population involved in the manufacture and distribution of physical goods is dropping. Follow that trend for a century, and you get a society where most people's jobs involve only virtual goods (although many of those goods will be turned physical by 3d printers or large bespoke manufacturing companies). Apple and google are the vanguard of the all-digital companies (apple isn't in the business of making stuff, only designing it). This means the majority of people will be making their money producing digital content, and spending it purchasing digital content. Already a sizeable portion of our income is spent on content (tv, dvd, games, books, magazines, ...). I see no reason why that trend shouldn't continue until we have a digital post-material economy.

And if we will have a digital economy, that means most people will be software users, producing content for others to buy. We're not just going to have to train the people that will make the software, but also the people that will use the software. I think the actual production of software will remain a small share of the economy.

dkrich 4 days ago 0 replies      
While I don't really agree with Marc's take on what's going on with current tech valuations, he hit on one extremely important point, and about it he is spot-on.

There is a major crisis coming in this country if the gap in skills and quality education continues to widen. Too many manufacturing jobs are moving overseas while high-tech jobs are expanding at unforeseen rates. I worry a lot about what's going to happen, and as controversial as it may sound, I think we may see a day in the not-too-distant future where the minimum wage is done away with.

As for Marc's commentary, it seems to me that every significant example he cited was related to content disruption (communications, entertainment, etc.). So I think a more accurate theme would have been "Why Software is Eating the Entertainment Industry."

jorangreef 3 days ago 0 replies      
Analogies limp:

A few centuries ago, someone may well have remarked that many of the best businesses in many industries were moving into office buildings, an invention that was only a few decades old at the time, and that office buildings were eating the world.

1. They would have been right.

2. Moving into an office building would mean that a business was keeping up, not that it was necessarily a good business with respect to other businesses.

3. It would have been a good time to be building office buildings.

NHQ 4 days ago 0 replies      
The cat is out of the bag! What the technologists have known all along! The market is huge, and mushrooming. Only the largest tech companies can keep pace with the growth, and there is still a-plenty for today's startups. Get yours today. Seriously!

Even already-connected markets, like the USA and Europe, will mature impressively as more people get better at using an improving internet. And by the time the entire living world is connected, birth rates alone will sustain satisfiable market growth.

How's that for a pitch?

Sniffnoy 3 days ago 1 reply      
> Today's leading real-world retailer, Wal-Mart, uses software to power its logistics and distribution capabilities, which it has used to crush its competition.

Wait... others don't?

> Likewise for FedEx, which is best thought of as a software network that happens to have trucks, planes and distribution hubs attached.

Again... do their competitors really not? How can they get anything done?

georgemcbay 3 days ago 1 reply      
"We believe that many of the prominent new Internet companies are building real, high-growth, high-margin, highly defensible businesses."

What about profit? Doesn't ignoring profit (the article only mentions it in the context of Apple despite name dropping some other wildly unprofitable businesses) sort of suggest that maybe we are in fact making at least some of the same mistakes as the last bubble?

"Today's largest video service by number of subscribers is a software company: Netflix"

Netflix certainly uses a lot of software, but I think it is slightly disingenuous to paint them as a software company.

smackay 3 days ago 1 reply      
An interesting article, however the trend described might be a superficial one. Replace "software" with "paper" and the same arguments could be made for the economy 100 years or more ago. That indicates that the real driving force is something more fundamental (productivity is mentioned several times here) that could result in software being replaced with something better.
hello_moto 3 days ago 2 replies      
As someone who has been involved in software for a long time (since I was a kid), of course I love to read news like this.

Having said that, I noticed that in almost all Hollywood futuristic-themed movies, software (or any ground-breaking invention, usually found with the help of more advanced software/hardware) tends to cause problems that force humans to destroy it and put humanity back in a pre-software world.

I hope that would never happen, but looking at the trend that whatever Hollywood producers imagine usually comes true in real life (even though it may take 5-10 years after the movie is out) makes me scared sometimes when I read news like this.

rbreve 3 days ago 0 replies      
In music it's all software now: mixers, sequencers, and synths are all software based. DJs use software like Traktor or Serato to mix their mp3s.
garrison 2 days ago 0 replies      
> The days when a car aficionado could repair his or her own car are long past, due primarily to the high software content.

More likely, it's because all said software is proprietary. If people got the source code to their cars' computers, you'd see a lot more people repairing their cars (and a lot more interest in automobiles in the current generation, leading to real, open innovation in the space).

It's interesting to consider how cars went from being something anybody could hack on, to something that only a few "qualified" people are now able to service. I don't expect the same thing will happen with software (in other words, I don't think "trusted computing" will ever become the norm), but we must be sure that it never does if we want innovation to continue in the software space.

gills 3 days ago 1 reply      
This seems inevitable, and positive. There are some friends of mine who dislike the resulting job shedding and concentration of wealth; I am not quite sure how that will shake out.

It will be interesting to see if today's 'software' disruptors will themselves be disrupted by software. Today's revolution seems to me a changing of the guard from the massive, inefficient, people-driven gatekeeper to the massive and lean software-driven gatekeeper. I wonder if the evolution of this will lead to decentralization and eventual diminution of today's usurpers?

NY_Entrepreneur 4 days ago 0 replies      
He omitted a bigger, more central point:

The main drive in the economy is more productivity; the main approach there is more automation; the main approach there is computer hardware driven by software.

Next, the main point about software is that it be 'smart' enough to give especially valuable output. For that the main tool is math.

gabaix 4 days ago 1 reply      
Something I noticed: he did not use Facebook as an example, while talking about Google, Linkedin, Zynga, Apple, Amazon.

Is there a reason?

Adkron 4 days ago 2 replies      
It is amazing to see how far we have come. I'm so glad that I picked the right industry to get into. The world changed all around us, and we were a part of it.

My only fear is that this will flood the market with crappy developers. That is how this change WILL be like the DOT COM boom.

known 2 days ago 0 replies      
Politicians support software only if it furthers the interests of others to ultimately serve their own self-interest.
7952 3 days ago 0 replies      
Technology is just mimicking the rest of the economy in developed countries by moving to selling services rather than selling stuff.
metrobius 4 days ago 1 reply      
Well I guess this means that emotionally warped hypergeeks will truly inherit the Means of Production and run your life like a program---from cradle to grave. Fuck Marx and Engels, we have Andreesseeeeen and Zuckerberg.
A letter from _why aberant.tumblr.com
305 points by fakelvis  22 hours ago   92 comments top 16
aaronbrethorst 21 hours ago 4 replies      
Please add [2005] to the title. You had my hopes up there for a moment.
paganel 20 hours ago 1 reply      
> I admire programmers who take risks. They aren't afraid to write dangerous or "crappy" code.

I can still remember the advice a guy at least 10 times smarter than me gave me at the start of my programming career: "One of the most important things for a programmer to have is courage". At that time I couldn't fully understand what he really meant; I was thinking that REST vs. SOAP or PHP vs. Java or OO vs. Functional Programming were way more important for a programmer to get right compared to just having "courage". But as I grow older I realize how wrong I was.

jarin 21 hours ago 2 replies      
My favorite _why-ism was how he would hand write and scan code snippets for his blog (often without any explanation of what it did). Then, when lazy people started OCR-ing the images, he would post code as animated GIFs.

It was not only fun to look at, you actually had to type out the code yourself to find out what it did.

api 14 hours ago 1 reply      
"Until an asteroid,"

By far the best byline I have ever read.

gnufied 16 hours ago 0 replies      
Those who think somehow _why is advocating writing bad code aren't paying attention:

>Twenty lines here and there and soon people will be beating you up and you'll be scrambling to build on to those scripts and figure out your style and newer innovations and so on.

The point I think is, write (possibly bad) code and evolve. Break stuff, innovate and evolve.

ISeemToBeAVerb 20 hours ago 2 replies      
As a beginner at programming, I find _why to be a breath of fresh air. I realize that experienced coders may berate him for advocating writing sloppy code, but for someone (like me) who is just getting into this deep rabbit hole, I find his thoughts to be encouraging.
I fully agree with some of the comments here that mention writing bad code is the only path to writing clean and safe code. I wish more experienced hackers could recall a day when they, too, wrote bad code. As a beginner, I'm positive that much of my code would make people here cringe, but hey, at least I'm learning! Ultimately, I think that was _why's point. Kids and beginners shouldn't worry if their code is "correct", they should just write code and keep learning. I think that's a noble endeavor and a great legacy.
jessedhillon 21 hours ago  replies      
Can someone write an explanation for why this is important? Who is this person?
hrabago 15 hours ago 1 reply      
First, you have to learn the rules. Then you have to master the rules. You have to really know what they're for, how they make things better. Then, finally, you can start breaking the rules.
jeff_5nines 15 hours ago 0 replies      
'Until an asteroid' is probably one of the best sign-off ever. It's true that we really don't know what's coming down the pipe for us, so code and be happy, or whatever you do, but be excited and motivated about it.
signa11 21 hours ago 0 replies      
From yosefk's site: Risk aversion is innovation aversion...
itsnotvalid 19 hours ago 0 replies      
Just remember that there are many ways to become successful. However without even trying, there is no way to learn anything.
figital 12 hours ago 0 replies      
newfound respect. thanks very much for posting this.
javadyan 21 hours ago 3 replies      
> They aren't afraid to write dangerous or “crappy” code. If you worry too much about being clean and tidy, you can't push the boundaries.

Yes, of course. You push the boundaries, move on, and at the end of the day, we have to maintain the stinking pile of "experiments" you left us with. Ugh.

oceanician 18 hours ago 0 replies      
Ahh, I thought this was a letter from 'beyond the grave' rather than a historical one. Oh well. Imagine he's out there doing something clever somewhere.
billmcneale 13 hours ago 0 replies      
"I do not write tests for my code. I do not write very many comments. "

and then:

"I admire programmer who take risks"

Denial much?

By the way: this was written in 2005.

bh42222 14 hours ago 1 reply      
_why is not an average programmer. His advice is good for masters of programming. He is also a super nice guy and sounds like he thinks anyone could become a super programmer.

I don't think so. And I fear his advice will be taken most to heart by below-average programmers.

Do all first links on Wikipedia lead to philosophy? matpalm.com
282 points by tbull007  2 days ago   47 comments top 22
maeon3 2 days ago 3 replies      
Community -> Living -> Life -> Physical body -> Physics -> Natural science -> Science -> Knowledge -> Fact -> Information -> Sequence -> Mathematics -> Quantity -> Property (philosophy) -> Modern philosophy -> Philosophy

It's kind of like zooming in on what it means to be alive in this universe. The fact that it ends at Philosophy is a profound glimpse into what it means to be a thinking entity in the universe.

If we ever meet Aliens from another part of the galaxy, they would no doubt form similar knowledge structures that would probably end up being exactly like this. Their Wikipedias would end at Philosophy as well.

JoshTriplett 2 days ago 0 replies      
Someone previously created a "steps to philosophy" site (http://news.ycombinator.com/item?id=2587352), but it seems to have vanished.
westicle 2 days ago 0 replies      
Project HN: Identify all non-conforming Wikipedia articles and edit them to fit the pattern.
cormullion 1 day ago 0 replies      
Then, can you get from Philosophy to Mornington Crescent?


stonemetal 1 day ago 0 replies      
When the question came up on XKCD a little while ago, the answer was "no, there are several loops that don't loop through philosophy". On a more conceptual level, what does it mean to lead to philosophy? First links on Wikipedia do not form a tree with philosophy as the root; after all, philosophy has a first link that is not itself. So we are looking at a graph and attempting to determine if all random walks of the graph pass through point P.
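stonemetal's framing can be made concrete: since each article contributes exactly one out-edge (its first link), the pages form a functional graph, and a walk from any start either reaches the target or falls into a cycle that misses it. A toy sketch in Python, with a hand-picked link table rather than real Wikipedia data:

```python
# Each known page has exactly one out-edge, so a walk either hits the
# target or revisits a node (a cycle) and never will.
first_link = {
    "Community": "Living",
    "Living": "Life",
    "Life": "Philosophy",
    "Han Solo": "Harrison Ford",
    "Harrison Ford": "Han Solo",  # a loop that never reaches Philosophy
}

def reaches(start, target, links):
    """Follow first links from start until we hit target, revisit a node,
    or step off the known graph."""
    seen = set()
    node = start
    while node != target and node in links and node not in seen:
        seen.add(node)
        node = links[node]
    return node == target

print(reaches("Community", "Philosophy", first_link))  # True
print(reaches("Han Solo", "Philosophy", first_link))   # False
```

Running this over a real crawl would just mean building the dictionary from actual first links; the walk logic stays the same.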
clemesha 1 day ago 0 replies      
Related: http://TheWikiGame.com multiplayer game of connecting Wikipedia articles with different constraints
gsivil 2 days ago 0 replies      
I was about to link to previous discussions of the same claim/question


But then I read the article... Very nice!

blago 1 day ago 2 replies      
You can try it for yourself: http://blago.dachev.com/wikidrill
preamble 2 days ago 0 replies      
Already been done 5 months ago at http://www.xefer.com/
RobertHubert 2 days ago 0 replies      
Seeing as we humans created Wikipedia, perhaps in an effort to define and describe everything there is, the product ends up filtered through the lenses of its creators, and in doing so we inevitably end up defining what it is to be human. I don't believe we can understand or describe anything beyond what it is we are. Wikipedia is essentially the accumulation of the collective knowledge of its creators, so what else should we expect it to be outside of the definition of what it is to be man. The attempt to collect and master the understanding of everything is, after all, a philosophic endeavor.
Done babbling now lol.
RobertHubert 1 day ago 1 reply      
What about other sites like www.conservapedia.com or www.rationalwiki.org?

I tried on conservapedia and kept winding up at Earth or stuck in a loop.

RobertHubert 2 days ago 3 replies      
Just tested it for the hell of it, started with "FireFighter", thought it was random... and 30 clicks later landed on Philosophy! Fun stuff.
duien 2 days ago 1 reply      
By the time I looked at this, the end path had changed, as "Fact" now leads to "Truth" instead of "Information". How long until someone intentionally manipulates the chain?
bluekeybox 2 days ago 1 reply      
Why philosophy? If you keep clicking, you actually end up in a loop: Philosophy -> Reason -> Human nature -> Thought -> Consciousness -> Mind -> Panpsychism -> Philosophy -> ...

I'd say, of the above, "mind", "thought", and "reason" are pretty basic -- you cannot have philosophy without a mind, for one (though you can probably have a mind without philosophy).

fezzl 1 day ago 0 replies      
It even worked when I tried "Stone Cold Steve Austin"...
yxhuvud 1 day ago 0 replies      
But which of the twelve has the lowest average length? The article points to 'science', but how would the number of steps graph look then?
whacker 2 days ago 1 reply      
A lot of them do, but sometimes there are loops (e.g. Computer Science). If you make an exception, choosing the second link for example, then it will lead you to philosophy.
atomicdog 1 day ago 0 replies      
Nope. You can get stuck in loops pretty easily.
clownzor 1 day ago 0 replies      
I found a few that didn't go to philosophy back when the comic came out. My favorite: Han Solo -> Harrison Ford -> Han Solo...
RobertHubert 1 day ago 0 replies      
Start: Wikipedia -> free -> artwork -> Aesthetics -> Philosophy. 4 clicks away.
dwyer 2 days ago 0 replies      
Bob Dylan -> 1960s in music -> Popular music -> Music genre -> Genre -> Literature -> Art -> Senses -> Physiology -> Science -> Knowledge -> Fact -> Truth -> Reality -> Philosophy
p9idf 2 days ago 1 reply      
the author doesn't capitalize his sentences. i didn't find it difficult to read and only noticed halfway through the article. supposedly, capitalized sentences are easier to read, so i wonder if i've been conditioned by the internet to find uncapitalized sentences easy to read as well. off-topic, but interesting.
Your Code is My Hell avdi.org
273 points by joshuacc  1 day ago   78 comments top 17
Lewisham 1 day ago 5 replies      
I think the majority of the points here (which are good!) can be summed up with the "You are not a special snowflake" heading.

This particular form of arrogance (let's call it what it is) permeates throughout most sub-communities of CS programming. Ruby is just one tribe. Lispers are another. Game studios, web app developers, a different product team inside the same company, [insert pretty much any other here]... they all believe/rationalize that they are different to Java, and thus have nothing to learn from all those enterprise projects. Even HN has a particular anti-unit testing tribe that will turn up as soon as any article appears that says "Hey, Test Driven-Development is pretty good, huh?" who will complain that testing hurts velocity (it doesn't, see Etsy for a great example).

It's an arrogance that inevitably leads to the issues brought up in the article, taking out a huge technical debt that some other chump has to pay off. I'm not sure how this will ever change. I guess the conclusion is just to try and make sure you're not the chump.

rauljara 1 day ago 6 replies      
Ruby has such wonderful evangelists that somewhere along the line I had gotten the idea that all Rails projects had 100% test coverage, and refactoring was a matter of course. It's kind of a relief to hear that not only do not all projects live up to the ideals, some are the exact opposite.

Knowing that, maybe I can go a little easier on myself when I fail to live up to the ideals, and simply keep on striving without having to feel that I am obviously an inferior coder.

jballanc 1 day ago 3 replies      
I've had my fair share of refactoring legacy code bases as well, and I agree that Rails code is a special kind of horrible. I think you will find that this primarily traces back to the following observation:

Rails developers read GoF and think "object composition is awesome!" Then they go and create a bunch of acts_as_foo and has_bar modules which they mix into their models and controllers, all the while thinking "object composition is awesome!" Meanwhile, it never occurs to them that modules are, in fact, closer to multiple inheritance than object composition, and all the wonderful benefits of object composition that GoF touts are nowhere to be found. Instead, what's left is a giant plate of spaghetti code...

jrockway 1 day ago 1 reply      
The underlying issue is the attitude that if you manage to get some sort of process bound to a TCP port that produces HTML, you have nothing else to learn about programming. Then you join mailing lists and say stuff like "the law of demeter is bullshit" and then "unit testing is too hard, so I don't bother". But in reality, if you actually tried to make unit testing easy, then you'd independently discover the law of demeter and most of the other design patterns, and you wouldn't be telling other people that testing is too hard. It's hard because you don't care and haven't tried to do it. Or, you're bad at programming and it's time to pick a new career.

Either way, there is always new stuff to learn. It may crush your ego to learn something new, but in the end, it's a lot easier to read a book or Wikipedia article than to independently reinvent programming. Or maintain that app that's "too hard" to write unit tests for.

pavel_lishin 1 day ago 2 replies      
I've said it before, I'll say it again - in 5 to 10 years, people will view RoR as they view PHP today - as something that's ruined a lot of programmers.

How much of this sounds familiar?

> “Design Patterns are a Java thing. In Ruby you just write code.”

> “The warnings Ruby produces are dumb; just disable them.”

> In a way I think this is a testament to the power of the platform. If you're getting a 500 error in a Rails app, you can keep adding kludge after kludge and hitting “reload” until it works. No need to ever write a test or refactor. In languages and frameworks with a lower turnaround time, this kind of tweak-it-till-it-works workflow is simply impractical. Ruby on Rails has an impressively low barrier to fiddling.

olefoo 1 day ago 0 replies      
This is what happens whenever a language is seen as a hot ticket to a good job. It happened with PHP back in the late '90s, when thousands of people were learning to program on customers' web projects. The problem is in some cases worse with Rails because of the attitude of _some_ of the leaders in that community, which is imitated by too many of the followers; confidence does not always equal competence.
Pewpewarrows 1 day ago 3 replies      

    "Dividing methods into private and public is for control freaks, you don't need it in Ruby"

Coming from a Python dev's perspective, does Ruby have the concept of public/private? Python just has the underscore and double-underscore conventions: if you absolutely have to, you can use the "private" variables and methods, but in doing so you still understand that they might change without notice in future versions. "We're all adults here" is the motto, if I recall.
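A minimal sketch of the Python conventions mentioned above (the class and names are made up for illustration): a single underscore is purely advisory, and a double underscore triggers name mangling, not enforcement.

```python
class Account:
    def __init__(self):
        self.balance = 100       # public attribute
        self._rate = 0.02        # convention: internal, may change without notice
        self.__token = "s3cret"  # double underscore: stored as _Account__token

acct = Account()
print(acct._rate)                # accessible; the underscore is only a hint
print(acct._Account__token)      # mangling renames the attribute, it doesn't forbid access
print(hasattr(acct, "__token"))  # the unmangled name doesn't exist on the instance
```

(Ruby does have `private`, for what it's worth, though `send` can bypass it, so in practice it is also closer to a convention than a hard wall.)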

peteretep 1 day ago 4 replies      
Random: I found out recently that Ruby has no unified test output standard, and that every tool makes up its own o_O

Not only this, but there is active resistance to using something like TAP.

cageface 17 hours ago 0 replies      
My take on this is that the excesses of enterprise Java poisoned a generation of programmers' opinions about static typing. We're going to see people start rediscovering the benefits of static typing in a sane framework as more of these kinds of projects hit the wall.
wccrawford 1 day ago 1 reply      
Not sure that's a good title for the content...

But it's certainly true that sloppy code can be produced in any language.

hello_moto 1 day ago 0 replies      
Didn't Zed Shaw predict this before?

Anyhow, personal recent experience on using a plugin that is an API to some webservice:

- Lots of static methods

- Code structure is awful

- Documentation is non-existent

... and I rarely use Rails let alone learn Ruby yet I can see how horrible that plugin is.

I heard I'm not alone when it comes to the discussion of the quality of plugins out there.

I don't mean to bad-mouth Rails framework because it is a well-thought project (let's not discuss the internal code). But as many who have been in this industry for a while, we kind of know that this is coming sooner or later.

Python code, in many places, seems to have a good balance of pragmatism, UNIX culture, and discipline.

Having said that, Rails 3.x seems to mark a change in attitude from the Rails core team. They're starting to address issues and stabilize the framework for the better. Let's hope the rest would clean up as well.

3am 1 day ago 0 replies      
Worth it all for the subsection titled, "You are not a special snowflake"
jshen 1 day ago 1 reply      
I've been doing rails for a long time.

My biggest advice, use rails plugins sparingly, and when you do ensure they are well maintained. Nothing is worse than an old rails project that is using 20 plugins and you need to update it to a new major version of rails.

bitops 1 day ago 1 reply      
I wish I had 10 accounts so I could give you 10 +1s!

I guess those who forget the past are doomed to repeat it.

It's funny, principles of good programming are about as old as programming itself and yet everyone seems to feel the need to rediscover them periodically.

If you look at some great *nix utilities, they should "do one thing and do it well". Hm, sounds like the Single Responsibility Principle.

Dependency injection? Funny, reminds me a lot of *nix pipes.

So to me the moral of the story is: programmers with experience and an open mind will tend to make good decisions. Inexperienced programmers just need time if they're willing to learn.

And arrogant jerks will always make messes.

amcintyre 1 day ago 3 replies      
From the article: "In fact, offhand I can only think of one commercial greenfield Ruby project I've participated in."

I can say the same of C, C++ and Python in the context of my personal work history. Is Avdi's experience so unusual in the Ruby world? It seems that most development jobs primarily involve working on other people's code if the language has been around for any time at all.

Edited to add: I've also seen everything listed in the "But Rails is different!" section in every commercial project I've ever worked on. I suspect most developers that just want to hack away on commercial code without any discipline end up having nearly the same set of justifications, no matter what language they're using. (Especially that bit about ignoring/disabling warnings.)

powertower 1 day ago 2 replies      
That JS on page crashes IE8.
gbog 1 day ago 0 replies      
I would be interested in a comparison with Python projects.
PyPy 1.6 Released - Full Python 2.7.1 Implementation morepypy.blogspot.com
241 points by jparise  5 days ago   50 comments top 6
lliiffee 5 days ago 4 replies      
Can I take advantage of this thread to ask the HN crowd a technical question? Some time ago, I implemented an automatic differentiation tool. Using operator overloading on a special "autodouble" type, the tool would trace the execution of a block of numerical code. Then some calculus would automatically happen, and it would output and compile fast C code that would compute the original function and derivatives in pure C. This was great, except that the C code it output was freaking gigantic (like hundreds or thousands of megabytes), albeit very simple, and so the C compiler would take forever to run. Sigh.

My question is: could I leverage pypy somehow to avoid this? Can I output RPython? Can I output whatever RPython is compiled down to instead? Can I do this with no more than, say, a 3x penalty compared to c?

(I apologize for asking a question only marginally related to the particular article here...)
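For context, the "autodouble" tracing described above is forward-mode automatic differentiation via operator overloading. A minimal pure-Python sketch (illustrative only, not the original tool, and plain Python rather than RPython) shows the core idea; the usual hope with PyPy is that the JIT makes this style fast enough to skip the generate-giant-C step entirely:

```python
class Dual:
    """Forward-mode 'autodouble': carries a value and its derivative together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def _wrap(self, other):
        # Promote plain numbers to constants (derivative zero)
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._wrap(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # f'(x) = 6x + 2

y = f(Dual(2.0, 1.0))  # seed the input's derivative with 1.0
print(y.val, y.dot)    # 17.0 14.0
```

Subtraction, division, and transcendental functions extend the same way, one chain-rule case per operator.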

kristofferR 5 days ago 5 replies      
In general I'm very happy with my choice of Ruby/Rails instead of Python/Django, but PyPy is one of the few things I envy Python developers for.

I wish something similar could be developed for Ruby.

sylvinus 5 days ago 0 replies      
I'm always blown away by the consistent performance gains they reach with each new version. Congrats!
voyvf 5 days ago 2 replies      
> has beta level support for loading CPython C extensions.

Is this via ctypes, or "real" support in much the same as how CPython would behave?

I ask because this is one of the features that I've been waiting (impatiently) for - I've run some Flask projects using PyPy and gunicorn, and love how fast it goes, but really want to be able to use the rest of my codebase, which unfortunately does rely on some C (and Cython) extensions. (:

sho_hn 4 days ago 0 replies      
I'm still bummed at being stuck with the dilemma of having to choose between CPython 3.x and PyPy. PyPy with Py3k support would rock.
socratic 5 days ago 1 reply      
Are there production users of PyPy?

I feel like PyPy has always been the most academically interesting Python implementation. But has it taken away mindshare from CPython?

Why every programmer should have a Tiddlywiki eriwen.com
227 points by caustic  1 day ago   132 comments top 39
kabdib 1 day ago  replies      
Another vote for .txt files.

Easy to source-control and merge, easy to find stuff.

I can't help but think that someone, somewhere has a bloated XML-based API for doing this, probably requiring a server running a couple hundred megabytes worth of Java-based back-end. I'm old fashioned.

I don't anticipate a vanilla text file going out of fashion any time in the next hundred years or so.

maeon3 1 day ago 4 replies      
I've been keeping a wikipedia on myself, _everything_ about me and who I am, what I've done and who I want to be in a wiki for the last 4 years. It has been the most illuminating thing I've ever done, and I'm so happy I took the plunge that first day. Everything is in it, taxes, dental, computers, girlfriends, projects, outcomes, desires, todo, and thousands of other categories.

Looking back on my life through the wikipedia, I see a very different person than my own memory remembers. You'd be astonished how much stuff your brain removes for lack of use. I just use the Bluefish editor with basic HTML files, and I've got hotkeys to make templates. The root node is my full name; even the universe itself falls under that category, because the only way I know the universe exists is through my observations. The most gratifying part is the page outlining my genealogy. I have information documented about my DNA line going up 3 levels, and if you make a point to get really detailed information on your parents, grandparents and great-grandparents (biological), you'll see health problems and DNA-related hardware issues. I discovered programmers and certain abilities/disabilities run in my mother's side; I'm a programmer. With enough analysis I could probably identify the gene sequences, and thus help my offspring by telling them what their problems are going to be before they experience them, by looking at dominant and recessive traits.

shaggyfrog 1 day ago 12 replies      
Used to use a wiki (MediaWiki) to track source material for a book with another collaborator. Worked mostly fine, except for when the DB crashed. I remember that it was hard to extract data out of it.

Mainly I've kept most of my programming notes, records, todos, research and so on in flat text files. They are easy to view, edit, backup, exchange, copy from, and version. For my use cases, I find it a more appealing technical solution to a wiki, which requires infrastructure to back it.

If someone could come up with a git/wiki hybrid, that would be interesting.

bengillies 1 day ago 4 replies      
Hi, I'm a developer on TiddlyWiki.

I'm not sure if this is the done thing or not (I'm relatively new to posting on Hacker News so apologies if not) but if anyone wants an online version of TiddlyWiki (that handles revisions, multiple users, has a RESTful API, etc), then we're currently working on a service called TiddlySpace (http://tiddlyspace.com), which is basically an extensible online note taking app based on TiddlyWiki. It's on GitHub too.

kalid 1 day ago 3 replies      
For personal notes I like Notational Velocity (http://notational.net/), saved as plain text files in my DropBox, and linked with SimpleNote (http://simplenoteapp.com/) for iPhone syncing.
mgualt 1 day ago 1 reply      
These periodically recurring organization threads always motivate me to take another stab at adopting some kind of coherent approach to organizing my ideas.

Then I get depressed at how poorly thought out almost all the options are. I then become convinced that the only system that has been fairly well thought out is org-mode.

Then I fire up my emacs and the incredible frustrations and overwhelming disgust start to rise like a slow flood.

I load up my .org file and it's not wrapped. I can't read it. So I try to change the options. There are 5 incomprehensible word wrap options and none of them work. Then when I re-load my .org file, it hasn't saved the word wrapping setting.

I can't remember if it's shift-tab, alt-tab, command-shift.... one of them puts a TODO, another puts a new star, now suddenly I've reordered half of my lists...

Meanwhile, once I move a paragraph, emacs displays some kind of fragments of the characters in the previous location... what the hell is this?

I decide that I'd like to collaborate on a project -- org-mode seems like a great tool. But wait, I forgot that my collaborator is actually a NORMAL person. What is the chance that they will read the org-mode manual? Oh that's right, I'll just send them to the IRC channel.

Where was the calendar again?

I am literally in tears now. tears. This is the best personal organization software we have. It's 2011. Org-mode looks like my father's 1985 IBM terminal. Why do we always have to wait for Apple to inject a minuscule amount of actual DESIGN into software?

Please help me...

follower 1 day ago 0 replies      
For a number of years I used the single file Python wiki pwyky (http://infomesh.net/pwyky/) running on a server to serve as a personal lab notebook.

I added general project notes, links to relevant resources and a chronological log of projects.

The biggest benefits for me were:

  * keep track of multiple projects;
  * reduce overhead of switching between projects.

There was, however, also a benefit for the wider community as well, because all of the pages were public--even the in-progress projects--people could make use of the knowledge I'd found so far, even if the project itself wasn't (ever :)) entirely documented.

When the server hosting my wiki, err, disappeared, I decided to create a service that would provide me the benefits I'd found and also make them available to other developers.

Labradoc is the result:

  * http://www.labradoc.com/

With Labradoc you can:

  * make general project notes with Markdown formatting;
  * keep a chronological log of project progress.

I posted a Show HN a couple of months ago: http://news.ycombinator.com/item?id=2669425

Here's my own project list:

  * http://www.labradoc.com/i/follower

If you don't already keep a project log I encourage you to do so.

If you already keep a project log I encourage you to make more of it public (it's okay, the people that judge you badly for in-progress hacks you don't want to work with anyway :) ).

In either case, try out Labradoc and see how it suits your workflow.

xbryanx 1 day ago 1 reply      
I'm a big fan of using Notational Velocity for just these same reasons - http://notational.net/
barrkel 1 day ago 2 replies      
I spent a day a couple of years ago writing an app to solve this kind of problem, and more. File format is plain text files stored in directories. Every directory becomes a tab; every file becomes an entry, with the first line in every file being its title. Along with plain text files, .log files are kept; these keep a log of every edit to the .txt file.

Nothing is ever deleted. No edits are ever lost; the whole undo stack is always there, and if you undo to a particular point and edit it, it just becomes a new revision on top of the old undo stack. There is no save / load - the app uses the directory it was started from, and saves when no edits have occurred for 5 seconds.

It's a combination of notepad, brainstorming log, todo list, and general notes. I keep it in sync across multiple machines with dropbox.
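(Editor's aside: the never-delete design described above — a current .txt file paired with a .log of every revision — can be sketched roughly like this. The file layout, JSON-lines log format, and function names are guesses for illustration, not the parent's actual app.)

```python
import json
import os
import time

# Append-only note store sketch: the current text lives in
# <name>.txt, and every revision is appended to <name>.log as one
# JSON line, so no edit is ever lost and the full history survives.

def save_entry(directory, name, text):
    txt_path = os.path.join(directory, name + ".txt")
    log_path = os.path.join(directory, name + ".log")
    with open(txt_path, "w") as f:      # current state, overwritten
        f.write(text)
    with open(log_path, "a") as f:      # history, append-only
        f.write(json.dumps({"ts": time.time(), "text": text}) + "\n")

def history(directory, name):
    """Return every saved revision of a note, oldest first."""
    log_path = os.path.join(directory, name + ".log")
    with open(log_path) as f:
        return [json.loads(line)["text"] for line in f]
```

The save-after-five-seconds-idle behavior would live in the editor's event loop, calling `save_entry` once no keystroke has arrived for 5 seconds; since the log is append-only, "undoing past a point and editing" just adds new lines on top of the old ones.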

gurraman 1 day ago 1 reply      
I write my notes in markdown and keep them in a git repository. A simple hook generates a collection of html documents that I host on my private server for reading while commuting.
enjalot 1 day ago 0 replies      
I've been using google docs for these kinds of notes, searchable, shareable and accessible anywhere I can get to gmail.

I may consider dropbox + txt files, but gdocs is working fine so far.

jarin 1 day ago 0 replies      
I use Soywiki: http://danielchoi.com/software/soywiki.html

Toss something like this in your .bashrc or .zshrc to run it in MacVim and have an easy command:

  export SOYWIKI_VIM=mvim
  alias wiki='cd ~/wiki; soywiki'

Then just git init and push it up to a private GitHub repo, and you can check it and edit it on the go through the GitHub site.

aashay 1 day ago 2 replies      
I find Evernote/Simplenote more effective for brain dumps. Evernote is especially useful for storing diagrams and photos of whiteboard diagrams.
endlessvoid94 1 day ago 0 replies      
I'm hoping Asana largely solves these problems. I've been using it for about a week, just for personal projects and my work tasks, and it works wonderfully.

I can put links in comments and descriptions and easily reference them later, in a really usable way.

ja27 13 hours ago 0 replies      
Yesterday I started a comment asking if I'm the only programmer that doesn't keep many notes. I actually don't keep that many, relying on my memory and falling back to code comments, revision history, and email when that fails.

But then as I read through more comments, I realized that I've tried almost every one of these note systems - Tiddlywiki, TODO.txt, Evernote, Springpad, OneNote, Moleskine notebooks and others - but have never stuck with any of them. The only system that's ever stuck for me is email. I keep trying to get better at recapping meetings and decisions in email so I can find them later.

steve-howard 1 day ago 0 replies      
OneNote (part of Office since, I think, 2007) is actually great for keeping structured notes on meetings, projects, etc. I know a lot of programmers don't run Windows, but if you're not fazed by letting Microsoft handle the details (i.e. your stuff gets put on SkyDrive under a Live account), it's not a bad way to go. I set it up on my work machine after I'd been running it on my home computer for a while, and I was pleasantly surprised at how easily it loaded the notes I've had from past projects.

Disclosure: Microsoft intern (in a completely different division, and I've been using it for longer than I've been here). My opinions are mine. :)

runjake 1 day ago 0 replies      
I find Tiddlywiki extremely cumbersome and awkward.

Instead, I use Simplenote; Flick Note on my Android device, Resophnotes and SyncPad for Chrome on my PC.

_corbett 1 day ago 0 replies      
I worked for a year at a company and training two people to do my job from a stack of my personal notebooks and insider knowledge was a nightmare. After that I decided to keep a personal wiki which was easily searchable. Having gone the .txt route in the past and found it wanting, I decided on the next best thing and went with a Tiddlywiki.

I document all processes I develop and additionally keep a journal every day, and one at a project level. It has more than once saved my butt working on multiple projects and has made me a better manager. I finally graduated from a Tiddlywiki as I ended up wanting to collaborate on entries far too often. A year into my wiki experiment I switched to MediaWiki and haven't looked back as I can simply add collaborators and more transparently keep track of contributions.

For others wanting to transition I made a little conversion script here: http://www.itp.uzh.ch/~corbett/projects/code/tiddlywiki_to_m...

Tl;dr Keeping a wiki as a developer: ~best decision ever.

mhd 1 day ago 0 replies      
Personally I'm using Dropbox for stuff like this, using PlainText on iOS devices, Notational Velocity (well, NvAlt) on my Mac and increasingly [Deft](http://jblevins.org/projects/deft/) from within Emacs.

I know, there's Org Mode, but I'm more a free-form guy, same reason why I never used one of those monstrously popular GTD applications…

mkramlich 1 day ago 0 replies      
I have a collection of scripts, dotfiles, text note files, flat text data files, and ad hoc lightweight textual databases already to do this sort of thing. Doesn't require any special software, doesn't require a hosting server or active network connection, and yet I can edit, search, compress, refactor, version, backup and synch this data anyhow I want, anytime, essentially for free, and with an absolute minimum of moving parts. And using the same set of CLI tools I already must or at least should know anyway for my profession.
hboon 1 day ago 0 replies      
I've since moved on to plain text files, but at my last job, when I left, I just had to delete a few more sensitive notes before passing my TiddlyWiki file to my replacement, and he had all my notes for the job. It was a technical sales role. I'd argue that a personal wiki is even more important for many other vocations than for programmers.
aufreak3 17 hours ago 0 replies      
I use fossil (fossil-scm.org) for the same purpose. It is super nice that the wiki is integrated with the source code I'm working on and the bug tracker.
melling 1 day ago 0 replies      
I mainly use org-mode for detailed notes. Github (kind of) understands them so I now include README.org files.

I also recommend http://workflowy.com for capturing information. I'd pay for an iPad app that would sync with it.

katieben 1 day ago 0 replies      
I use MediaWiki now, I love it. It's free, and has really helped my organization. I use it for everything - passwords, API keys, links, random code snippets, etc. It's better than a text file to me, because both the structure and formatting ability encourages properly organizing data.

Though, it's most important to just have SOME well-greased system, no matter what the platform. If it's a text file that does it for you, great. If you spend 10 minutes trying to find a password - time to get a system.

mhb 1 day ago 0 replies      
Garrett Lisi's An Exceptionally Simple Theory of Everything TiddlyWiki:


jianshen 1 day ago 0 replies      
My favorite use of TiddlyWiki was making it the index.html in my home/public_html directory on our internal company server.

I had write access via network mount on my laptop, and whoever else just wanted to browse my notes could go to http://internalcompanyserver/~myname/

orph 1 day ago 2 replies      
I wrote a personal wiki for Linux called Tomboy.

I've been working on Hackpad.com lately. It's a realtime wiki based on Etherpad. Pads are private by default, and you can invite people to edit them later.

I think it's better than tiddlywiki. Why not try it out?

gaving 1 day ago 0 replies      
Personally use github's gollum: https://github.com/github/gollum which is pretty awesome and git-backed.

Allows you to do all your wiki pages in Markdown and a slew of other markup formats. Plus, did I mention it's git-backed?!

hsmyers 1 day ago 0 replies      
Took a nibble sometime back and was surprised by how well it did what it promised---that however was the problem. Hard to adapt to someone else's idea of a good time. I found it easier to revert to previous home grown methods and madness...
flarg 1 day ago 1 reply      
I use Zim Desktop Wiki - more reliable than Tiddlywiki, no need for a browser, version control
invaders 1 day ago 1 reply      
I would like to recommend this freeware lightweight structured text file editor. Definite upgrade over plain .txt files yet still extremely simple in every way. No browser/3rd party tool involvement needed. Its file format is human readable for the most part.

It supports all kinds of links in text: local, network files, internal pages (for wiki emulation). You can drag and drop links from web browser too.


I never stopped using this program since I found it several years ago.

eridius 1 day ago 0 replies      
Any Mac users who are interested in having a personal wiki should take a hard look at VoodooPad (http://flyingmeat.com/voodoopad/).
lobo_tuerto 21 hours ago 0 replies      
Can't save on latest dev chrome (15.0.854.0 dev), getting this error:

The original file '/home/yop/Documentos/wiki2/empty.html' does not appear to be a valid TiddlyWiki

detour 1 day ago 1 reply      
I use Workflowy to do this stuff.
cabalamat 1 day ago 0 replies      
I use MediaWiki myself.
rshm 1 day ago 0 replies      
i use .txt
zachsnow 1 day ago 1 reply      
Here's a vote for Instiki (http://instiki.org), a super simple wiki clone written in Ruby (that's pretty straightforward to hack on, which is a pretty important consideration when using wikis for 'non-traditional' purposes).
baghali 1 day ago 0 replies      
+1 Evernote
SalmanPK 1 day ago 0 replies      
blah. Blogging FTW :)
Samsung cites Kubrick's "2001" as Prior Art in Patent Case Against Apple macrumors.com
213 points by dvdhsu  9 hours ago   76 comments top 16
ender7 8 hours ago 5 replies      
Poor Gruber. Your favorite director's movie used against your favorite company.

It's impossible to post a serious comment to a case that is already absurd.

alexqgb 6 hours ago 0 replies      
(From the OP's comments / reposted on account of sheer awesomeness)

Steve: Stop selling the Galaxy Tab!

TAB10.1: I'm sorry Steve, I'm afraid I can't do that.

Steve: What's the problem?

TAB10.1: I think you know what the problem is just as well as I do.

Steve: What are you talking about, TAB?

TAB10.1: This mission is too important for me to allow you to jeopardize it.

GeneTraylor 8 hours ago 2 replies      
There actually might be a precedent over here.

Heinlein came up with a "waterbed" for a few of his books (it was mentioned quite prominently in Stranger in a Strange Land). Later on, a man called Charles Hall tried to patent his design, and he was denied the patent by the USPTO on the grounds that Heinlein's descriptions in Stranger in a Strange Land and Double Star constituted prior art.

It would be fascinating to see if this precedent is followed...

rwolf 9 hours ago 6 replies      
I was watching an episode of Star Trek: The Next Generation last night, and was surprised to see one of the characters carrying around a tablet computer.

I can believe that finding visually-pleasing ratios of screen/margin, height/depth is a challenge, but the basic outline does not seem to be hard to imagine.

kemiller 2 hours ago 2 replies      
A rectangle may be a rectangle, but the fact is that no one actually made a working and workable device in this form factor before Apple. If 2001 and Star Trek are prior art, then where were the hordes of copycats before 2010?
rodh257 6 hours ago 0 replies      
I think the point to take away from this is that the similarities in design between the iPhone/iPad and their competitors have a lot to do with the fact that really all they are is one big screen, and their current designs are the logical designs for them. There's really not too much you can do to differentiate other than put your logo on it, change the buttons, change the back, make it bigger, or make it square - and Samsung did all except the last one.

I won't doubt that their design job was made easier by the iPhones existence, but I don't think they set out to intentionally trick users into thinking it was an iPhone/iPad.

antirez 7 hours ago 2 replies      
The case is absurd and I'm all against Apple's position, but in the video from 2001 it is unclear whether we are seeing tablet PCs or simply two flat TVs integrated into the table. The problem is that the two devices are positioned in exactly the same way, mirrored.
mikecane 4 hours ago 0 replies      
FWIW, an episode of the TV series Logan's Run once showed a young woman using a color paint program on a flat tablet that was wirelessly transmitting to a wall-mounted screen. Today we'd say that was a Wacom tablet with a Bluetooth connection to an anybrand HDTV.

And let's not forget the word "art" in "prior art."

xutopia 9 hours ago 5 replies      
Can patents be invalidated like this? I'm surprised this is the case. Makes the whole thing that much more absurd.
teilo 7 hours ago 2 replies      
I distinctly remember a Bill Gates lecture from the '80s, with accompanying video, where he presents a future-concept tablet device with a touch screen, thin, with narrow margins, etc. It had an AI-based personal assistant that accepted voice commands, and a built-in camera with FaceTime-style chat.
joegaudet 2 hours ago 0 replies      
If this were allowed it would be quite the interesting blow to patent law. Much of what is _modern tech_ could be attributed to Sci-Fi: Cell Phones, Speech Recognition, all kinds of transportation.

I for one hope this does work... Primarily because I love 2001.

mobileman 8 hours ago 2 replies      
Based on this discussion, it would behoove innovative people to start writing fiction now and design the future.
juiceandjuice 6 hours ago 1 reply      
Samsung hOzc
Tichy 8 hours ago 3 replies      
Maybe Apple does not even believe they invented the tablet. But they have to act as if they do, because it feeds their cult following, which sells devices at the end of the day.
Women are rejecting marriage in Asia. The social implications are serious. economist.com
209 points by rblion  4 days ago   204 comments top 16
lionhearted 3 days ago  replies      
I have a theory that I haven't found expounded before. It came from a combination of travel through 60+ countries, living and working and interacting with local people on a pretty intimate level sometimes, and study of lots of history.

It's going to be controversial and maybe even shocking, so brace yourself for a moment before reacting please.

I think peaceful societies self-destruct.

With a few notable exceptions that require a geography suitable to isolationism, long-term peace has historically been achieved through your country or one of your allies having military supremacy over the rest of your neighbors.

Obviously, diplomacy can keep the peace for long periods of time, even human lifetimes, but eventually incidents happen when there's a hothead in one government, and then that's when the military supremacy determines whether you get attacked or not.

Anyways, I've found the more a country renounces war and gets further away from it, the more birth rates go down. You get an explosion of commerce and art for ~30 to ~70 years, and then the society self-destructs.

No longer forced to confront mortality and with no externally unifying cause, people start living for luxury, pleasure, and consumption. They stop having children. Birth rates fall off.

Eventually, this destroys a country's economy, the military supremacy fades, and one of their neighbors comes in and cleans house, and the cycle begins anew.

This has happened many times through history. It's happening in Japan right now. If I became an advisor to anyone in the Japanese government, I'd advocate two things as chief priorities - (1) exceedingly good relations with China, and (2) re-militarize.

Then join the next war they can on America's or China's side. Combined with some standard messages of nationalism/strength/growth/unity, birth rates would almost certainly increase.

emanuer 3 days ago 5 replies      
I am married to a Japanese and live here right now, so I can only comment on the situation of the country.
I will try to give an economical assessment of the situation. As you might be aware Japan had a “lost decade” during the 90s. It experienced almost no growth of the GDP. There are many theories as to why, but the most convincing one is: The Japanese “Baby Boomers” retired during this decade. In fact the percentage of workers in the population (age 15-64) decreased by 5.6% during this time, as the number of retirees increased by 10.7%. http://goo.gl/kVZxB

What does this have to do with anything? / What happens when you have fewer people in the expected working age?
2 things happen:

  1. More people have to work
2. People have to work longer

The official retirement age for Japanese is 63/61 (M/W), the effective retirement age is 69.5/66.5 http://goo.gl/s9slK (very interesting graph)

My wife held a managerial position before our son was born. Amongst her friends, hardly any want to stay single; they simply don't have a choice. If they want to sustain their current standard of living, marriage would be impossible. One man working has a hard time providing for his wife and kids while paying for a decent home & the mandatory elderly taxes. The taxes are almost 2000 USD per person / year, whether employed or not. And those are just to pay for the pensions; insurance, social contributions etc. come on top of that.

Many women still live with / have moved back in with their parents, even in their late 40s. It is not because they want to, or cannot find a partner (many of them look quite stunning). The pensions simply don't suffice and they feel obliged to support their parents. The same holds true for many men.

I very much disagree with the statement that women in Japan enjoy their single life so much that they choose not to marry. I have yet to meet a Japanese woman who will state this - in contrast to Europe, where this kind of ideology is quite common.


Japanese women are expected to do 90% of the housework, whereas American women will “only” do 60% of the work. http://goo.gl/qj64W This fact and the very big distance Japanese develop from their spouses are certainly not helping. http://goo.gl/QlAfx Surprisingly, no woman I ever conversed with complained about this lack of love between a married couple; on the contrary, it is expected by the women. And it often leads to problems in marriages with foreigners.

Edit #2:

If my experience with Japanese culture is in any way representative, employing more men in the military would be the worst thing imaginable. The country has debt of 180% of GDP, and every person in the army is one person fewer doing productive work to keep the country afloat.

Edit #3:

When the "Baby Boom" generation retires in the United States, do you expect the country to sustain positive GDP growth? It would mean that every person working has to work harder and produce more just to reach 0 growth - something the Japanese managed to do. The only solution to this is immigration, something Japan is battling against.

gamble 3 days ago 0 replies      
A friend of mine has lived in Japan teaching one-on-one English lessons for a number of years. A good chunk of his customers are recently married women. Their husbands don't want them to work, but with nothing to do at home they end up taking English lessons out of a desperate need to find something mentally stimulating to pass the hours.
burgerbrain 3 days ago  replies      
Good for them. More people need to reject this archaic sexist tradition. Legal enforcement of the fantasy notion of "true love for life" is damn near barbaric.

If you and your partner can swing it, more power to you, but social pressure on others to place themselves into legally binding situations revolving around this notion is something that needs to die.

In the west (at least the states) these are legally binding arrangements that are heavily biased against males, but if women are rejecting that in Asia then all the better.

NY_Entrepreneur 3 days ago 1 reply      
Seems to me that the article missed the most central point. Let's start with a little, simple, relevant background:

(1) In the past, e.g., in tribal or agricultural communities, women had children whether they really 'wanted' to or not.

(2) Now in more industrialized countries, a significant fraction of women have some options. Some women still have children but some women do not want to have children and do not.

(3) The change in number of children per woman is a 900 pound gorilla in the room: The article mentioned numbers under 2 children per women with a rate a low as 1 child per woman. There is some recent data that Finland is at 1.5 children per woman.

Of course, for any rate much under 2, each 20 years or so the population will be going down significantly. "Get your old houses, furniture, dishes, baby clothes, etc. cheap, cheap, cheap!".
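(Editor's aside: the generational arithmetic above is easy to check. A back-of-the-envelope sketch, ignoring mortality, migration, and sex ratio — each woman averaging r children means each generation is roughly r/2 the size of the previous one, since each child has two parents:)

```python
# Population size after n generations at a fixed fertility rate.
# Each generation is about (rate / 2) the size of the previous one.
def generations(pop, rate, n):
    for _ in range(n):
        pop *= rate / 2.0
    return pop

# Finland-style 1.5 children per woman over five ~20-year generations
# shrinks a population to under a quarter of its starting size:
print(round(generations(100, 1.5, 5), 1))  # 23.7
```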

So, net, women who don't want to have children largely won't. These women will be 'weak, sick, dead limbs on the tree' and will be pruning their genes from the tree. What will be left are women who, given the choice, actually, effectively WANT to have children.

The big point: After a few such generations, we will be left with a much smaller population with many fewer women but nearly all of whom WANT to have children. Then the population will start growing again.

One more big, surprising point: We are now, in much of the world, in the most rapid change in the gene pool of the last 40,000 or so years.

Where did the 40,000 come from? Humans walked out of Africa back there somewhere. At one point, ballpark 40,000 years ago, they reached, say, India. One branch went west to Western Europe, and another branch went east to China and Japan. Mostly the two branches haven't much mixed since then.

Okay, at the common branch, ballpark 40,000 years ago, what were women like? Well, take women from Japan and women from Western Europe. Take some 'characteristic' in common, say, desire of a major fraction of the women, given an economic opportunity, not to have children. Now, start in either Japan or Western Europe and count genetic 'changes' on this 'characteristic' going backwards in the tree to the common branch, about 40,000 years ago, and, then, continuing to count changes, going forward in the other branch of the tree to the present.

So, if on this characteristic the women in Japan and Western Europe are close, that is, have few changes, then on this characteristic the common ancestor 40,000 years ago has still fewer, that is, is closer to the women in both Japan and Western Europe than they are to each other.

Net, since a significant fraction of women in both Japan and Western Europe will, given an economic opportunity, choose not to have children, that is, these women are close to each other, both are still closer to their common ancestor 40,000 years ago. So, for 40,000 years, many women had children not really because they 'wanted' to but because of economic necessity.

So, now that women who don't really want to have children are being pruned from the tree, we are, on this characteristic, in the most rapid change in the human gene pool of the past 40,000 years. And, the change is VERY rapid, should have a huge effect in just a few generations, say, just 100 years, which on the scale of 40,000 years, is FAST.

thevivekpandey 3 days ago 2 replies      
"So far, the trend has not affected Asia's two giants, China and India."

It is not fair to label a trend as "Asian" when you need to exclude China and India.

diN0bot 3 days ago 1 reply      
"Japanese women, who typically work 40 hours a week in the office, then do, on average, another 30 hours of housework. Their husbands, on average, do three hours."

"Marriage socialises men: it is associated with lower levels of testosterone and less criminal behaviour. Less marriage might mean more crime."

bennesvig 3 days ago 3 replies      
George Gilder's "Naked Nomads" is a great book studying single men in America. Unmarried men top the majority of bad statistical categories: higher death, suicide, crime, and disease rates. No society wants that burden.
EGreg 3 days ago 0 replies      
Yes, an aging population might be worrisome on the surface, but I find overpopulation more worrying. Therefore, I am happy that people in really populated countries are having less babies overall!


Longer (but more boring) version:

peteretep 3 days ago 0 replies      
> in Bangkok, 20% of 40-44-year old women are not married

In Bangkok, 20% of the women you see in couples on the street are part of a woman-woman couple, holding hands - usually one with short hair and a shirt, and one with long hair and a dress - something I've never seen as prevalent anywhere else in the world, and something the locals don't look twice at.

wyclif 3 days ago 1 reply      
It would have been useful if this article had taken into account the fact that the laws of certain SE Asian countries (i.e., the Philippines) do not provide for divorce:


bobo888 3 days ago 1 reply      
Is it really the women who are avoiding marriage?

Do this experiment: go to a newspaper stand and look for magazines about weddings. Look for books about marriage. I have never ever seen one addressed to men (there are ones which seem to be, but on a quick glance I actually think they were written to feed women's egos), which means, IMHO, that men wouldn't spend money on subjects like these. Meanwhile you shouldn't be surprised to find at least half a dozen for women. So I really, really doubt that men are any more willing to marry, either.

I would actually dare to say that men were (and still are) the ones who don't really care about marriage.

rednaught 3 days ago 0 replies      
Maybe the Philippines and Indonesia don't represent most of Asia, but illegitimacy has become a very obvious problem in the last decade there. The only reason divorce is not more common is because an annulment in the Philippines represents an astronomically prohibitive barrier for most citizens.
known 3 days ago 0 replies      
I believe marriage and career are mutually exclusive for women in Asia
Shenglong 3 days ago 1 reply      
I smell a business opportunity.
Qa8BBatwHxK8Pu 3 days ago 0 replies      
Good thing. But being gay this never bothers me.
Introducing WebAPI mozilla.org
201 points by sylvinus  17 hours ago   40 comments top 11
dave1010uk 16 hours ago 2 replies      
It would be great if Mozilla worked with PhoneGap on this as they already have APIs [1] that cover most of these and have pretty good cross-platform support already [2].

[1] http://docs.phonegap.com/

[2] http://www.phonegap.com/about/features

dotBen 13 hours ago 3 replies      
As much as I support what they are doing, I don't see Apple implementing any of this any time soon - even if it is in the Webkit trunk, they'll just rip it out/deactivate it.

Apple's strategy for their phone platform runs deeper than just what is in the open source Webkit codebase.

I used to think/hope open platforms would ultimately win in the market but iOS and the Apple App Store has kind of proved otherwise.

MatthewPhillips 16 hours ago 0 replies      
coderdude 15 hours ago 1 reply      
>>Mozilla would like to introduce WebAPI with the goal to provide a basic HTML5 phone experience within 3 to 6 months. [...] Specification drafts and implementation prototypes will be available, and it will be submitted to W3C for standardization.

3 to 6 months is a fairly short timeline for getting all browser vendors and the W3C on board, not to mention getting stable and secure implementations into the wild. At least they're doing all the legwork by putting together the initial specification drafts and implementation prototypes. These additions would be truly incredible if accepted. I know they like to lump every standard under HTML5 these days so I wonder if this is intended to be an addition to that or if WebAPI is completely orthogonal to HTML5.

drivebyacct2 12 hours ago 1 reply      
I would prefer that Contacts were not an API and just a user-created/chosen web app. Why is there a need for a Contacts API?

On a separate note, should someone be starting up an OMA push proxy for SSE? I have a feeling that between Chrome getting an Android port, the Moz/Chrome WebRTC work, and Moz's Web APIs, Server-Sent Events are going to become "a thing", and an OMA push proxy would be a useful service.

robjohnson 14 hours ago 0 replies      
I think the biggest issue is that you would need to get buy-in from the major manufacturers and it doesn't seem to be in their best interest. The further divergence of the platforms, in tandem with the conscious raising of switching costs, is one of their main strategies.
CGamesPlay 11 hours ago 1 reply      
This appears to be very close to a directly competing standard (well, standard proposal) to Google's WebIntents:



mkramlich 6 hours ago 0 replies      
good idea. horrible name.
Sembiance 12 hours ago 0 replies      
Sums this up nicely: http://xkcd.com/927/
mgutz 13 hours ago 0 replies      
It would be great if Mozilla followed through on their projects. Prism to Chromeless to stalled. Chromeless has potential to expand their core product, which is the browser. Why does everyone have the need to be Google and have their hands in everything?
BrandonSmith 15 hours ago 2 replies      
Frankly, this is Mozilla's reaction to Google's http://www.webrtc.org/

To put it in perspective, Mozilla has no mobile platform and is crying for relevance.

Apple -> iOS/WebKit
Google -> Android/WebKit
Mozilla -> ???/Fennec

By getting desktop browsers to implement telephony and other collaboration APIs, I'm sure their hope is to get mobile browsers to likewise expose the precious voice and video APIs of the mobile handsets. And perhaps by doing this, lines between browsers on mobile devices are blurred to where Fennec has a role.

Best algorithms book I ever read eriwen.com
195 points by slashams  1 day ago   42 comments top 15
kenjackson 1 day ago 1 reply      
I also highly recommend Sedgwick: http://algs4.cs.princeton.edu/home/

I haven't read the new edition, but his older editions were very accessible. More for practitioners. For example, I don't think there were many, if any proofs (but I may be way wrong on that). A good amount of actual code though.

lawnchair_larry 1 day ago 3 replies      
> The only caveat here is that most of the examples are written in C, which can be troublesome if you don't know or have forgotten about pointers.

It is hard for me to take advice seriously from someone who says this.

webspiderus 1 day ago 1 reply      
I read Skiena's book as preparation for interviews after having gone through a course using CLRS the year before, and I definitely feel like that is the best way to do it. I feel like reading Skiena got me back to mostly the same place I was last year after being waist-deep in CLRS, but I do feel that a more thorough examination of algorithms is needed to really grasp a lot of the nuances. it's perfect for dusting off the cobwebs though!
cageface 1 day ago 2 replies      
I think it's funny that people still think walking people through algorithms on a whiteboard is a good way to hire programmers.
shefield 1 day ago 1 reply      
I've read Skiena's book (the first part) and CLRS (about 15 chapters), and I think CLRS has much more to offer. The number of algorithms, the rigorous proofs, and the explanations that CLRS offers are not comparable with Skiena's.
I would suggest Skiena's to start, but for a more thorough understanding a switch to CLRS is essential.
absconditus 1 day ago 1 reply      
Which other algorithms books has the author read?
dougb 1 day ago 0 replies      
This is a very good book. Worth the money.
Skiena has a companion website with a lot of good info too,

I loved the Stones quote.
The book does have some bugs, if something doesn't look right, check the website errata.

conradev 1 day ago 1 reply      
I read a different book to learn algorithms, 'Algorithm Design' by Kleinberg and Tardos, and I think it's a fantastic book, with lots of sample material that actually makes you think. I actually may try this book to see how it compares.
Locke1689 1 day ago 0 replies      
To be honest, I found Skiena's book a bit too introductory. I'm doing my preparation for interviews right now and I think I'm going to try to use TAOCP as my algorithms book. For the programming part I'm not sure if any book is going to help me. I tend to think that reading books rarely helps with programming -- only programming does.
matan_a 1 day ago 0 replies      
Great book! I enjoyed it as well.

It provides an excellent middle ground. It's not as hard-core as Cormen's "Introduction to Algorithms", which might be a good thing for some people (like me). I find that I retain more information when I have a real-world application to attach it to, and this book does a great job providing those.

rue 15 hours ago 1 reply      
Doesn't seem to be available as an ebook, a strangely common problem for “hard” CS texts.
neuromage 19 hours ago 0 replies      
The chapter on dynamic programming alone is worth the price of this book. Seriously, if you have trouble with DP, I can recommend no better way to learn it than going through Skiena's chapter on it. That catalogue of problems at the end is spectacularly thorough and useful too.
mtogo 1 day ago 0 replies      
I agree, I bought this book a while ago and it's been one of my best purchases so far. A really great resource.
kitsune_ 1 day ago 0 replies      
I own this book. Recommended. It's much gentler (not necessarily easier) than most algorithms books.
ramiyer 1 day ago 2 replies      
Skiena is an awesome professor too and teaches at SUNY Stony Brook, but the sad part is he never teaches SB's algorithms course. It's an amazing book, and I recommend everyone interested in algorithms read it.
A Simple Explanation for Why HP Abandoned Palm daringfireball.net
193 points by A-K  2 days ago   96 comments top 20
acangiano 2 days ago 4 replies      
I think his interpretation of the facts is very plausible. Sadly, Larry Ellison was absolutely right when he said, at the time of Hurd's scandal, "the HP Board just made the worst personnel decision since the idiots on the Apple Board fired Steve Jobs many years ago."

Good CEOs who can revamp stagnant companies are hard to come by, particularly in the consumer space. Hurd was that CEO, Apotheker is not. And getting out of the consumer space, at a time when there are several paradigm shifts going on, means missing huge opportunities.

cletus 2 days ago  replies      
I tend to agree with the basic reasoning but there's one thing I don't understand.

Why cancel the TouchPad? Why not just spin that out with the hardware business to Compaq (which is the idea, yes?)? It seems they've devalued that unit by killing WebOS.

All PC makers are on razor thin margins... apart from Apple. A post from yesterday [1] painted an interesting picture where Dell, through a series of seemingly rational decisions, essentially taught Asus the PC business, allowing the Taiwanese manufacturers to eat US PC makers for breakfast in later years.

[1]: http://news.ycombinator.com/item?id=2907187

dgreensp 2 days ago 0 replies      
I have no faith in HP's leadership.

I was talking to an insider last year about how HP wanted to be a computer maker. I said, "I know them for their printers, they make awesome printers." He said, "Yeah, they're not so interested in printers these days. They think the money's in computers."

I guess it's whatever the "money's in" this week.

sek 2 days ago 3 replies      
For Hurd, WebOS was just a fig leaf to make a forward-oriented impression; in reality, all the profits he made were short term.

What HP needs now is focus, and that is what Apotheker is providing. Some people are disappointed that it is not consumer oriented, but when you look at the "smart phone wars" this seems reasonable to me.
HP can't compete with Google, Apple and Microsoft there. Look at Nokia and Blackberry.

nikcub 2 days ago 1 reply      
The HP board made the decision, and I am surprised that nobody has yet made the link between Andreessen's editorial in the WSJ (he sits on the HP board) and this recent shakeup.
rwmj 2 days ago 3 replies      
This quote: "Autonomy - a company I'd never heard of before but which more or less sounds like a rival to SAP" tells you everything you need to know about the blogger. He knows very little about the software business, and is just making up opinions based on his gut feeling, without backing them up with knowledge or evidence.
forgotAgain 2 days ago 0 replies      
HP also wants to be in the cloud. After the past week I think they've killed that as well. Who would lock themselves in to HP at this point?
dr_ 1 day ago 0 replies      
Just because Hurd decided to bet big on WebOS doesn't mean that bet was ever going to pay off.
Gruber is correct, and HP and Apotheker are going along with their plan as intended. The tablet/phone business was a long shot, and their PC business was going to get weaker over time; they don't own the OS, and hardware is tough to compete in against Chinese manufacturers. Their best shot at long-term survival is enterprise software, and that's what they are pursuing.
Perhaps they could come up with some kind of enterprise solution for WebOS in the future, but it's probably not their focus right now.
Tichy 2 days ago 0 replies      
That's stating the obvious. Another question would be, why was Apotheker hired? Whoever did that must have expected something like that?
ayanb 2 days ago 1 reply      
HP has a highly distinguished board of directors. I find it strange that no one is highlighting the fact that Léo Apotheker is definitely enjoying the vote of confidence of the top level executives, otherwise this would have never been possible.
nivertech 2 days ago 2 replies      
"Autonomy - a company I'd never heard of before"

Autonomy is a leader in their field

daimyoyo 1 day ago 1 reply      
It's really too bad. I really wanted webOS to gain some traction but now that HP has all but killed it, it seems incredibly unlikely that webOS will be anything more than a footnote in history.
llambda 2 days ago 0 replies      
I guess this is why the vision of the CEO matters so much to investors (think Jobs). But regardless, HP was in trouble. I wonder if they would have stayed in the hardware business even if Hurd had stayed on. Maybe WebOS was not long for the company regardless of who was at the helm; its unequivocal success notwithstanding, maybe it'd have been abandoned anyway.
bugsy 2 days ago 0 replies      
I don't see what the big advantage is of CEOs few have heard of being dragged in to nuke existing companies and rebuild them in the image of the old company he used to work at. Why not just nuke the company out of spite (or alternatively sell off all assets and give to charity if not spiteful) and have the CEO stay at the old place and keep doing the same thing. Same outcome but not as painfully dragged out.

Obviously if you keep doing multi-billion-dollar acquisitions, then trash it all and fire everyone a couple of years later when you switch CEOs, and then, when things get even worse, fire that CEO with a golden parachute and bring in his cousin to do it all again, pretty soon your company headquarters are going to be an abandoned grassy field.

Talk about burn rate!

nvictor 2 days ago 0 replies      
walmart - sold out

target - sold out

best buy - sold out

online hp shop - sold out

office depot - sold out

amazon - ripping off customers with original price

newegg - rip off also

where am i gonna get my touchpad future android mega device? ;(

thewileyone 1 day ago 0 replies      
Wow ... first time I'm agreeing with Gruber whole-heartedly.
buster 2 days ago 0 replies      
At least HP now has Palm's pretty valuable patent portfolio.
iand 2 days ago 1 reply      
He's never heard of Autonomy so therefore assumes it's similar to SAP? Hmmm.
dramaticus3 2 days ago 1 reply      
fireball is a visionary genius
anigbrowl 2 days ago 3 replies      
Autonomy is nothing like SAP. What a ridiculous, empty post.
What is in your .vimrc stackoverflow.com
190 points by nyellin  3 days ago   83 comments top 33
gbog 2 days ago 3 replies      
These vimrc posts are not always very useful. Actually, it should be remembered that each line in a vimrc may have bad side effects. It may increase Vim's loading time, increase its editing footprint, or slow down processes like highlighting and scrolling.

Moreover, and probably worse, each specific configuration increases the distance between your daily Vim-fu and the one you'll have to use as another user or on another machine. And those occasions have been, in my experience, the ones when great Vim-fu was most critical (e.g. trying to keep your hand on a dying server over a flooded connection, or showing off your skills on your boss's Mac during a plane trip). All this holds for bashrc too. The closer to the default, the better, to some extent.

I take vimrc posts as good occasions to proofread mine and remove all unused stuff. I just commented out a very weird "set notagbsearch" which was probably killing my <ctrl>-].

Pewpewarrows 3 days ago 1 reply      
Managed using the "homesick" command-line utility to propagate changes to all my working machines:


mitjak 3 days ago 1 reply      
I don't have much to add to the thread except for:

    set undofile

which will allow for persistent undo, i.e. undoing changes even after closing a file.
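For completeness, a hedged sketch of a fuller persistent-undo setup (the `~/.vim/undo` path is just an example of my own choosing; `undofile` and `undodir` need Vim 7.3+):

```vim
" Persistent undo (Vim 7.3+): keep undo history after a file is closed
if has('persistent_undo')
    " Collect undo files in one directory instead of littering the tree
    if ! isdirectory(expand('~/.vim/undo'))
        call mkdir(expand('~/.vim/undo'), 'p')
    endif
    set undodir=~/.vim/undo
    set undofile
endif
```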

unfasten 3 days ago 0 replies      
Insert single characters: Press 's' in normal mode and the next character you type will be inserted at the cursor, leaving you back in normal mode. Press 'S' (capital S, i.e. shift+s) and the character will be inserted after the cursor, like 'a' (append). This is also repeatable, so you can insert a character and then do '5.' to insert it 5 times, still leaving you in normal mode afterwards. Being repeatable is the reasoning I read for it being a function. I picked this up from the Vim wikia site a while ago.

    " Insert single char (repeatable)
    function! RepeatChar(char, count)
        return repeat(a:char, a:count)
    endfunction
    nnoremap <silent> s :<C-U>exec "normal i".RepeatChar(nr2char(getchar()), v:count1)<CR>
    nnoremap <silent> S :<C-U>exec "normal a".RepeatChar(nr2char(getchar()), v:count1)<CR>

Press 'F5' to run the file you're editing, assuming it has a shebang.

    " Run current file if it has a shebang
    function! <SID>CallInterpreter()
        if match(getline(1), '^\#!') == 0
            let l:interpreter = getline(1)[2:]
            exec ("!".l:interpreter." %:p")
        else
            echohl ErrorMsg | echo "Err: No shebang present in file, canceling execution" | echohl None
        endif
    endfunction
    map <F5> :call <SID>CallInterpreter()<CR>

I don't actually use this one a lot, but it can be handy: F10 switches between the line numbering modes, in Vim versions that have relative line numbering (>= 7.3).

    " Toggle line numbering modes
    " Default to relativenumber in newer vim, otherwise regular numbering
    if v:version >= 703
        set relativenumber
        let s:relativenumber = 0
        function! <SID>ToggleRelativeNumber()
            if s:relativenumber == 0
                set number
                let s:relativenumber = 1
            elseif s:relativenumber == 1
                set relativenumber
                let s:relativenumber = 2
            else
                set norelativenumber
                let s:relativenumber = 0
            endif
        endfunction
        map <silent><F10> :call <SID>ToggleRelativeNumber()<CR>
    else
        set number
    endif

joelthelion 2 days ago 1 reply      
It's a shame that SO doesn't allow these types of questions anymore. They are very useful for beginners who want to know how experienced users actually use the tool.

The fact that they cannot be answered objectively doesn't make them less useful, and contrary to what is stated in the FAQ, the question and answers model is perfectly suited to this type of question.

markbao 3 days ago 2 replies      
from those answers:

    nore ; :
nore , ;

Do this now. Probably not the vimrc line that has saved me the most time... but definitely saved me the most pinky pain.

_sh 3 days ago 0 replies      
I work with multiple files a lot, so I'm always navigating between split screens and across buffers.

  " Split windows/multiple files
" use <Ctrl>+s to split the current window
nmap <C-S> <C-W>s
" use <Ctrl>+j/<Ctrl>+k to move up/down through split windows
nmap <C-J> <C-W>j
nmap <C-K> <C-W>k
" use <Ctrl>+-/<Ctrl>+= to maximise/equalise the size of split windows
nmap <C--> <C-W>_
nmap <C-=> <C-W>=
" use <Ctrl>+h/<Ctrl>+l to move back/forth through files:
nmap <C-L> :next<CR>
nmap <C-H> :prev<CR>

Note these use the same 'hjkl' navigation keys.

fauziassegaff 3 days ago 0 replies      
Generally speaking, the .vimrc is the config file that has been of the most use to me. I messed around with mine for a while, and finally I now use Janus by Carlhuda https://github.com/carlhuda/janus (thanks to them), their distribution of .vimrc configuration. It includes just about everything I need for my MacVim; it has good plugins and configurations that give me a solid starting point for development.

For others who don't want to mess around with vimrc configs (although it's fun), just give it a shot; they will happily accept contributions.

git clone git://github.com/carlhuda/janus.git

(don't forget to rake it after)

viraptor 2 days ago 0 replies      
Interesting bits:

make sure I'm scrolling visual lines, not real lines

    noremap j gj
noremap k gk

ctrl+l/h for changing tabs

    noremap <C-l> gt
noremap <C-h> gT

Search improvements:

    set incsearch
set hlsearch

Best theme ever (very objective of course):

    let g:inkpot_black_background = 1
colors inkpot

Making sure tmp files are stored in only one location, not all around the system:

    if ! isdirectory(expand('~/vimtmp'))
        call mkdir(expand('~/vimtmp'))
    endif
    if isdirectory(expand('~/vimtmp'))
        set directory=~/vimtmp
    else
        set directory=.,/var/tmp,/tmp
    endif

oinksoft 3 days ago 0 replies      
This is one of my favorite bits from my .vimrc. It lets you use !find with location list:

  function! g:Find(...)
    let subexpr = 'substitute(v:val, ".*", "\"&\" 0: found", "")'
    let found = join(map(split(system('find ' . join(a:000, ' ')), '\n'), subexpr), "\n")
    exec 'lgete "' fnameescape(found) '" | lop'
  endfunction

  command! -nargs=+ Find call g:Find(<f-args>)

The :Find command above passes its arguments to `find`.

I use splits heavily, and these mappings for navigating and resizing splits are indispensable:

  nnoremap <C-K> <C-W>k
nnoremap <C-J> <C-W>j
nnoremap <C-H> <C-W>h
nnoremap <C-L> <C-W>l
nnoremap _ 3<C-W><LT>
nnoremap + 3<C-W>>

duggan 3 days ago 3 replies      
I'm not sure how many man-hours were lost to fatfingering :wq as :Wq or :w as :W, but a simple alias has solved that particular bit of grief:

   cnoreabbrev Wq wq
cnoreabbrev W w

The rest of my .vimrc mostly belongs to the guy I caught the vim addiction from, but sets some useful defaults: https://github.com/duggan/dotfiles/blob/master/.vimrc

cpeterso 1 day ago 0 replies      

  " Automagically save files when focus is lost
autocmd BufLeave,FocusLost silent! wall

" Highlight whitespace at the end of a line
highlight ExtraWhitespace ctermbg=Black guibg=Black
match ExtraWhitespace /\s\+$/
autocmd BufWinEnter * match ExtraWhitespace /\s\+$/
autocmd InsertEnter * match ExtraWhitespace /\s\+\%#\@<!$/
autocmd InsertLeave * match ExtraWhitespace /\s\+$/
autocmd BufWinLeave * call clearmatches()

" Disable man key
nnoremap K <nop>

jonasb 2 days ago 0 replies      
The most important thing I've learnt recently regarding Vim config is Pathogen. https://github.com/tpope/vim-pathogen

With it it's much easier to keep plugins separate and encourages putting your own tweaks in custom plugins.

marshray 3 days ago 0 replies      

  "	Shift-Alt-S    -- (C++) - change the current    word/identifier in a quoted
" string to an ostream expression.
" For example, put the cursor on on the 'xxx' in:
" cout << "value = xxx\n";
" hit Shift-Alt-S and it changes to:
" cout << "value = " << xxx << "\n";
inoremap <S-A-s> <Esc>lbdei" << <Esc>pa << "<Esc>bb
inoremap ^[S <Esc>lbdei" << <Esc>pa << "<Esc>bb
noremap <S-A-s> lbdei" << <Esc>pa << "<Esc>bb
noremap ^[S lbdei" << <Esc>pa << "<Esc>bb
onoremap <S-A-s> <C-c>lbdei" << <Esc>pa << "<Esc>bb
onoremap ^[S <C-c>lbdei" << <Esc>pa << "<Esc>bb

ashley_woodard 2 days ago 1 reply      
This is an ugly hack I came up with to lay out my windows how I like them. I have NERDTree and Taglist in a horizontally split window to the left and MiniBufExplorer across the top of the screen. See http://yfrog.com/h7sg7fp

  autocmd VimEnter * call <SID>LayoutWindows()

  function! s:LayoutWindows()
      execute 'NERDTree'
      let nerdtree_buffer = bufnr(t:NERDTreeBufName)
      execute 'wincmd q'
      execute 'Tlist'
      execute 'wincmd h'
      execute 'split'
      execute 'b' . nerdtree_buffer

      let mbe_window = bufwinnr("-MiniBufExplorer-")
      if mbe_window != -1
          execute mbe_window . "wincmd w"
          execute 'wincmd K'
      endif

      execute 'resize +20'
      execute 'wincmd l'
  endfunction

gcr 3 days ago 2 replies      

  nnoremap \ta <Esc>:tab ball<CR>

Now you can run `vim foo bar baz` and, once Vim opens, just type `\ta` to open the files cleanly in three different tabs. Why they named a command "tab ball" I will never know.

marshray 3 days ago 0 replies      
I made a little convention of marking 's' and 'd' as the top and bottom of a range of lines. Then I define several handy utilities like:

  "	Shift-Alt-Z    #-comment range 's,'d
inoremap <S-A-z> <Esc>:'s,'ds/^/#/g<CR>:noh<CR>
inoremap ^[Z <Esc>:'s,'ds/^/#/g<CR>:noh<CR>
noremap <S-A-z> :'s,'ds/^/#/g<CR>:noh<CR>
noremap ^[Z :'s,'ds/^/#/g <CR>:noh<CR>
onoremap <S-A-z> <C-c>:'s,'ds/^/#/g<CR>:noh<CR>
onoremap ^[Z <C-c>:'s,'ds/^/#/g<CR>:noh<CR>

sliverstorm 3 days ago 2 replies      
I go with whatever the default is. I've logged on to hundreds of *nix machines in just the past few years, and it's completely not worth the effort to try and maintain a concurrent configuration.
dfranke 3 days ago 0 replies      

  dfranke@ancalagon:~$ ls ~/.vimrc
ls: cannot access /home/dfranke/.vimrc: No such file or directory

jonathanwallace 3 days ago 0 replies      
I forked a great vim_config for ruby/rails coding and made a few tweaks of my own.


mun2mun 2 days ago 0 replies      
My favourite two lines (found in another .vimrc long time ago).

   set switchbuf=useopen,usetab

Files are opened in an existing window or tab if one already contains them. Handy for the Command-T plugin.

   autocmd BufReadPost * normal `"

Remembers the cursor position of files.

amix 3 days ago 0 replies      
jedberg 3 days ago 2 replies      
marshray 3 days ago 2 replies      
I map semicolon to <Esc>, and ctrl-l to insert a semicolon in insert mode.

<Esc> is one of the most frequent commands, no reason it should be on one of the farthest keys.
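A sketch of what those mappings might look like in a vimrc (the parent didn't specify which modes the remap applies in, so insert mode here is an assumption; adjust to taste):

```vim
" Leave insert mode with ; instead of reaching for Esc
inoremap ; <Esc>
" Ctrl-L types a literal semicolon while in insert mode
inoremap <C-l> ;
```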

IznastY 3 days ago 1 reply      
imap jj <Esc>
LeafStorm 3 days ago 4 replies      
While the flexibility and portability of Vim is quite attractive, I doubt I could really retrain myself to use the modal interface. Are there packages/scripts/whatever that would allow one to use Vim in the way that one would use a more "normal" text editor?
pointyhat 2 days ago 0 replies      
Cobwebs: syntax on; set ts=4; set sw=4; set ai;

Keep it simple :)

james2vegas 3 days ago 0 replies      
I don't have one, I have a .nexrc
ConceitedCode 3 days ago 0 replies      
There are some invaluable little snippets in there.
HN now comes with HTTPS? ycombinator.com
178 points by mike-cardwell  2 days ago   78 comments top 13
mike-cardwell 2 days ago 2 replies      
The Firefox addon HTTPS Finder just alerted me to the fact that there was an HTTPS version of the site at https://news.ycombinator.com/ - I tried it out, and it worked. Nice work.

EDIT: Session cookie needs to be set as "secure" and Strict-Transport-Security should be implemented in order to protect against certain attacks. End users can just add this HTTPS-Everywhere ruleset:
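For illustration, such a ruleset would look roughly like this, following the standard HTTPS Everywhere ruleset XML format (a hypothetical sketch, not necessarily the exact ruleset referred to above):

```xml
<ruleset name="Hacker News">
  <target host="news.ycombinator.com" />
  <rule from="^http://news\.ycombinator\.com/"
        to="https://news.ycombinator.com/" />
</ruleset>
```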


yahelc 2 days ago 0 replies      
So, now the HN effect will be less measurable, as traffic from HTTPS to HTTP doesn't pass a referrer.

(This just increases the incentive for sites to use HTTPS everywhere, so they're not left in the dark as to who is sending them traffic.)

Joakal 2 days ago 2 replies      
Nifty report of HN's HTTPS: https://www.ssllabs.com/ssldb/analyze.html?d=news.ycombinato... Grade C it seems
st3fan 2 days ago 2 replies      
This is great. It would be even better if HN used the Strict-Transport-Security header so that browsers remember to prefer https over http for this site.

See http://blog.sidstamm.com/2010/08/http-strict-transport-secur...
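For reference, an HSTS response header is a single line like the following (the one-year max-age is an arbitrary example value):

```
Strict-Transport-Security: max-age=31536000; includeSubDomains
```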

wallflower 2 days ago 0 replies      
Thank you, RTM and PG!!
charlieok 2 days ago 0 replies      
Glad to see this trend. How about using https links in the RSS feeds, so that anyone coming in from their feedreader gets https by default?
pieter 2 days ago 1 reply      
For some reason it's now impossible for me to connect over normal http:

    manila:pieter$ curl -I http://news.ycombinator.com
curl: (52) Empty reply from server
manila:pieter$ curl -I https://news.ycombinator.com
HTTP/1.1 200 OK
Date: Sun, 21 Aug 2011 15:09:27 GMT
Content-Type: text/html; charset=utf-8
Cache-Control: private

jorde 2 days ago 0 replies      
Nice addition to HN. We also updated Chrome's Readable HN to work with HTTPS: https://chrome.google.com/webstore/detail/jpnbjaechgbbpokepg... If you would like to contribute: https://github.com/jorde/readable-hn
deno 2 days ago 0 replies      
As a side-effect HN has upgraded from HTTP 1.0 to HTTP 1.1.
brackin 2 days ago 1 reply      
This is great, thanks PG! The only problem for me is that the theme I'm running will no longer work, but that's okay; I'm sure they'll update it.
MikeCapone 2 days ago 2 replies      
Is it possible for HN to use Google's SPDY protocol for better performance?
pclark 2 days ago 2 replies      
Why do people want https on Hacker News?
k33n 2 days ago 2 replies      
Seems like overkill to me.
       cached 24 August 2011 04:11:01 GMT