Hacker News with inline top comments, 13 Jul 2012
1
Scaling lessons learned at Dropbox, part 1 eranki.tumblr.com
151 points by eranki  3 hours ago   25 comments top 9
1
brc 1 hour ago 3 replies      
The idea of running extra load - it sounds good in theory but I can't help thinking that it's a bit like setting your watch forwards to try and stop being late for things. Eventually you know your watch is 5 minutes fast so start compensating for it. I wonder if this strategy starts to have the same effect - putting fixes off because you know you can pull the extra load before it becomes critical. In the same way you leave for the train a couple of minutes later because you know your watch is actually running fast.
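(A minimal sketch of the technique brc is reacting to, assuming a hypothetical query wrapper; the flag name and helper are invented for illustration, not taken from the Dropbox post:)

```python
import random

EXTRA_LOAD_FACTOR = 1.0  # 1.0 = re-issue every read; set to 0.0 in an emergency

def query(db, sql, params=()):
    """Serve the real query, plus a duplicate that exists only as headroom."""
    result = db.execute(sql, params)
    if random.random() < EXTRA_LOAD_FACTOR:
        db.execute(sql, params)  # duplicate work; result deliberately discarded
    return result
```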
2
akent 1 hour ago 1 reply      
I noticed that a particular “FUUUCCKKKKKasdjkfnff” wasn't getting printed where it should have

Why not take the extra half a second to make those random strings meaningful and hidden behind a DEBUG log level?
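(What akent describes takes only a couple of lines with Python's stdlib logging; the logger name and message are illustrative:)

```python
import logging

logging.basicConfig(level=logging.INFO)  # DEBUG messages are hidden by default
log = logging.getLogger("sync_worker")

# Meaningful, silenced in production, and one level change away when debugging:
log.debug("reconnect handler reached for shard %d", 42)
```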

3
prayag 2 hours ago 2 replies      
Fabulous post. Thanks for writing.

One point it misses though is to test your backup strategy often. When you scale fast, things break often, and it's good to be in the practice of restoring from backups every now and then.

4
acslater00 2 hours ago 1 reply      
For the record, I use sqlalchemy 0.6.6 regularly under fairly heavy load, and have never had a problem with it. Any 'sqlalchemy bugs' are inevitably coding mistakes on my part.
5
jgannonjr 1 hour ago 1 reply      
Great post, but this part scares me a bit...

I think a lot of services (even banks!) have serious security problems and seem to be able to weather a small PR storm. So figure it out if it really is important to you (are you worth hacking? do you actually care if you're hacked? is it worth the engineering or product cost?) before you go and lock down everything.

Just because you can "afford" to be hacked doesn't mean you shouldn't take all the steps necessary to proactively protect your data. In the end, security is not about you, it is about your users. This is exactly the type of attitude that leads to all the massive breaches we have been seeing recently. Sure, your company is "hurt" with bad PR, but your users are the ones who are the real victims. You should consider their risk (especially with something as sensitive as people's files!) before you consider your own company's well-being.

Edit: formatting

6
nl 1 hour ago 1 reply      
I wish he'd left the security advice out.

The whole post was excellent, but all the useful points will now be overshadowed by the armchair quarterbacking about security by people who mostly don't understand that ALL security is a compromise, and it is as important to understand and make deliberate decisions about your security as it is to try to make a secure system in the first place.

7
ivankirigin 2 hours ago 1 reply      
Rajiv is awesome; you should listen to him.
8
philfreo 2 hours ago 1 reply      
Can you explain the nginx/HAproxy config a little more?
9
matt 2 hours ago 0 replies      
Nice, love the idea of running with extra load to predict breaking points.
3
Rust 0.3 released github.com
56 points by eslaught  2 hours ago   8 comments top 6
1
zem 57 minutes ago 0 replies      
i love that they're thinking of syntax sugar right in the early stages. a lot of the "language semantics are all that matters" crowd scoff at sugar, but i think they underestimate how much syntax contributes to the pleasantness of using the language.
2
eslaught 2 hours ago 1 reply      
3
lbotos 26 minutes ago 1 reply      
Does anyone know the origin of the name? I did a quick look on wikipedia/github/project site but couldn't find anything conclusive.
4
btipling 46 minutes ago 0 replies      
Looking forward to 1.0. The #rust IRC channel on irc.mozilla.org is full of activity.
5
brson 1 hour ago 0 replies      
6
fzzzy 1 hour ago 0 replies      
Yay! Congrats guys!
5
EpicEditor: An Embeddable JavaScript Markdown Editor github.com
53 points by mars  2 hours ago   10 comments top 6
1
cwilson 1 hour ago 2 replies      
I'd love to implement this in our platform instead of a WYSIWYG, but the problem is our users (non-technical) are not going to take the time to learn markdown. Even if we had a guide on the right, I have a hunch they would still be super confused.

Idea: Add buttons to the top (optional) that LOOK like a WYSIWYG (something they are familiar with), that simply apply markdown around text.

I think you'd see much wider adoption with that addition, which is something I'd love to happen, because WYSIWYGs do indeed suck.

2
guptaneil 1 hour ago 0 replies      
This looks cool, but unfortunately does not work well on touch devices. It looks like the preview and maximize buttons depend on the hover event.
3
DanielRibeiro 2 hours ago 0 replies      
Hint: the maximize button splits the screen, giving you an edit/preview view (like the CoffeeScript site does for coffee/js).
4
dfischer 1 hour ago 1 reply      
I think it would be better if it offered some way to see a cheat sheet. I don't see that in the initial examples, and I'm too lazy to investigate.
5
pstuart 2 hours ago 1 reply      
Nice. MultiMarkdown would be even cooler!
6
president 1 hour ago 0 replies      
This is brilliant.
6
Facebook Monitors Your Chats for Criminal Activity mashable.com
48 points by adventureful  2 hours ago   37 comments top 13
1
sriramk 1 hour ago 2 replies      
This actually tripped up a friend of mine a couple of years ago. She left a comment on a photo of someone holding a toy gun saying "You look like <insert-name-of-well-known-terrorist>" followed by a smiley. Within hours, she got a message and a phone call from someone claiming to be working for FB's security who asked her some basic questions on why she left that comment. The whole experience scared her from using FB for a long time.

I thought the whole thing was ad hoc and confusing. Anyone who saw the comment could easily see that it was a joke. Also, if it wasn't a joke, why is FB calling her and not someone from law enforcement?

Would love it if someone from FB here on HN could comment.

2
rationalbeats 1 hour ago 3 replies      
I'm not a criminal, I am a pretty mundane guy actually, but of course we live in a society in which every single one of us breaks some small law every day.

Which is why I stopped using Facebook.

I also stopped using Twitter to tweet. I still use it to follow news sources, I just don't actively tweet. I did that after the NYPD won a court case to see all the private messages you send on Twitter.

I also don't comment much at all on blogs and social sites like this one or Reddit anymore. (I used to be a top 10 contributor over at Reddit. At least that is what some metric said a few years ago when someone listed the top ten most popular usernames. That account is deleted now.)

I am slowly pulling out. I have a deep distrust of the current surveillance state in the United States. I remember reading a story about a guy who posted a quote from Fight Club as his Facebook status, and a few hours later in the middle of the night the NYPD was busting in his door, and he spent 3 years in legal limbo over it. (Might have been NJ police; anyways, red flags.)

You start piecing together these things, and you start to realize that your thoughts and ruminations about life, the universe, and the mundane, can be used against you at any moment and can completely strip you of your liberty and freedom, and any happiness you may have had.

I am gonna be completely honest, I am scared to express myself any longer on the Internet in any fashion. I don't trust it any longer. I don't trust the police, I don't trust the FBI, I don't trust the federal government, and I also don't trust, nor have faith, in the justice system in the United States.

3
stfu 1 hour ago 1 reply      
What I find interesting is that the "but think of the innocent children" argument is now also getting adopted by the corporate world to justify incredible privacy invasions.

Facebook's mass wiretapping and analysis of its users' private communication seems almost like the post office scanning each and every letter and postcard in the vague hope of finding some keywords related to bombs, terror, and of course "children". I wonder how long it is going to take until Google starts sending automated notifications to my local police station when I google some water bomb tutorials for the summer.

4
zethraeus 1 hour ago 1 reply      
The Mashable article seems to be sourced from a Reuters article.
http://www.reuters.com/article/2012/07/12/us-usa-internet-pr... The program does appear to focus on sexual predators.

Mashable quotes Facebook as stating “where appropriate and to the extent required by law to ensure the safety of the people who use Facebook"

Can anyone speak to whether or not proactive scanning could possibly be required by law? It seems entirely unlikely, but IANAL.

5
lignuist 1 hour ago 1 reply      
Someone should monitor Facebook for criminal activity.
6
chrsstrm 1 hour ago 0 replies      
So what happens if Facebook's system flags a message, it is reviewed by their staff and then dismissed as non-actionable, but turns out to be the precursor to a severe criminal act? Does the blame come back on Facebook for failing to prevent this crime?
7
chrisballinger 1 hour ago 1 reply      
All the more reason to use encryption technology like Off-the-Record (OTR) Messaging (http://www.cypherpunks.ca/otr)! I've been working on an OTR-compatible iOS app called ChatSecure (https://chatsecure.org) that is capable of encrypting your Facebook chats (or any other XMPP service).
8
fl3tch 1 hour ago 1 reply      
This looks like it's mostly targeted at sex predators, but I wonder if the system is also activated if you jokingly tell a friend that they are "smoking crack".
9
Zenst 1 hour ago 0 replies      
FB has a terrible reputation with regards to privacy without real justification.

This is not surprising in any way.

If you don't like this then don't do Facebook; really that easy, I have found.

10
katbyte 51 minutes ago 0 replies      
Is there any way to easily and securely encrypt Facebook chats? A quick google finds:

http://abine.com/facebook.php

11
Zenst 1 hour ago 0 replies      
Whilst FB has legal obligations in many countries, I must say when I read "phrases that signal something might be amiss, such as an exchange of personal information or vulgar language", the first thing that sprang to mind was nothing to do with crime. People swear, people exchange details. So I guess a lot gets flagged up to their staff.

Question is, do they warn you that your private conversation is not private, and do they comply with the data protection acts the various countries have, and more importantly, who monitors FB? So many things can be taken out of context and acted upon in good faith to the detriment of innocent parties; this is concerning. But I don't do FB, nor do I have any immediate plans to. That has nothing to do with this story, and more to do with concerns in general about their privacy and the policies they act out.

12
neo1001 1 hour ago 1 reply      
I once made a joke on FB about a friend and posted "you smoke doobies", and he was so shit scared that he took it down. Lol
13
Zenst 58 minutes ago 2 replies      
What if Facebook makes a mistake, do they get done for wasting police time? Monitoring is all fine, but it needs to be done independently; anything else is a conflict of interest and something that Facebook staff can abuse.

You know, it would not surprise me one bit if Facebook had staff monitoring this, modding down every post that holds them in a true^H^H^H^HBAD light.

7
UK anti-encryption law falkvinge.net
279 points by timf  10 hours ago   162 comments top 24
1
nathan_long 9 hours ago  replies      
His argument is: 1) They can lock you up for refusing to decrypt something. 2) Encrypted data looks exactly like random noise. 3) Encrypted data can be hidden in any file. 4) Therefore, they can allege that nearly anything is encrypted and lock you up on that basis.

I'd say that's terrifying.

Another thought: doesn't this make it possible to frame someone by writing random data to their hard drive?

2
16s 9 hours ago 2 replies      
It is impossible to prove a PRNG'ed file is or is not encrypted data. TrueCrypt volumes look identical to the output of `dd if=/dev/urandom of=file.bin bs=512 count=2048`. Create a few of each and then evaluate them using ent to see this for yourself.

Edit: Link to ent http://www.fourmilab.ch/random/
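(If you'd rather not install ent, a rough Python equivalent of its headline statistic, Shannon entropy per byte, is below; anything statistically random, encrypted container or urandom dump alike, should score very close to 8.0:)

```python
import math
import sys
from collections import Counter

def entropy_per_byte(path):
    """Shannon entropy of a file's bytes, in bits per byte (max 8.0)."""
    data = open(path, "rb").read()
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

if __name__ == "__main__":
    print(entropy_per_byte(sys.argv[1]))
```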

You could prove the file is encrypted if it is indeed encrypted and you have the passphrase and the program to decrypt it, but outside of that, it's simply not possible to say with any level of confidence that the bits are really encrypted.

BTW, I wrote TCHunt in 2007, a program that attempts to seek out encrypted TrueCrypt volumes and I have a FAQ that covers much of this. Here's the link for anyone interested in reading more about it: http://16s.us/TCHunt/

And, there is usually much more to it than randomish bits in a file on a disk. The government agents usually have other evidence that suggests the person in question is doing illegal things and may have cause to use encryption. Finding actual encrypted data is normally just icing on the cake to them.

3
mootothemax 9 hours ago 1 reply      
In the section of the act mentioned (Regulation of Investigatory Powers Act 2000, part III), two of the defined terms are:

“key”, in relation to any electronic data, means any key, code, password, algorithm or other data the use of which (with or without other keys):

(a) allows access to the electronic data, or

(b) facilitates the putting of the data into an intelligible form;

-- and --

“protected information” means any electronic data which, without the key to the data:

(a) cannot, or cannot readily, be accessed, or

(b) cannot, or cannot readily, be put into an intelligible form;

http://www.legislation.gov.uk/ukpga/2000/23/part/III

At first, I thought the argument in this article was nonsense. However, whilst I'd hope common sense would prevail, the definitions above seem broad enough that a policeman could make one's life difficult for a while.

4
shill 9 hours ago 0 replies      
Every digital storage device on earth should contain a randomly sized random data file called RANDOM-DATA. The user of said device could optionally replace this file with encrypted data. Once critical mass is achieved, states that do not respect individual liberty would have no way of determining the nature of every RANDOM-DATA file that they obtain by eavesdropping, theft or force.

I know the answer to this is "easier said than done". Certainly hardware and OS vendors can't be trusted with this task. Maybe FOSS installers could educate users and optionally create the file (see the sketch below)? How can we make this happen? I want to wear a t-shirt that says 'random numbers save lives.'
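(A sketch of that installer step; the name RANDOM-DATA is shill's, the size range is arbitrary:)

```python
import os
import random

# A randomly sized file of random bytes: indistinguishable from an
# encrypted container, whether or not the user ever replaces it with one.
size = random.randint(1, 64) * 1024 * 1024  # 1-64 MiB
with open("RANDOM-DATA", "wb") as f:
    f.write(os.urandom(size))
```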

5
SEMW 9 hours ago 2 replies      
While it is obviously a bad law, it's not quite as bad as he's making out.

s.53(3):

"For the purposes of this section a person shall be taken to have shown that he was not in possession of a key to protected information at a particular time if"

(a) sufficient evidence of that fact is adduced to raise an issue with respect to it; and

(b) the contrary is not proved beyond a reasonable doubt."

In other words, if there's evidence for there to be 'an issue' about whether you actually do have a key (or whether e.g. it's just random noise), it's up to the prosecution to prove beyond reasonable doubt that it is actually data, and you do have the key.

So the flowchart is:

- If the police can prove they have reasonable grounds to believe that something is encrypted data that you have the key to, then

- That raises an evidential presumption that you do have it, which you can rebut by

- adducing evidence that just has to raise an issue about whether you have a key (inc. whether it's encrypted data at all), in which case the police have to

- Prove beyond reasonable doubt that it is encrypted, and you do have the key.

(IANAL)

6
shocks 6 hours ago 1 reply      
Hidden volumes.

Volume one contains hardcore porn, volume two contains bank job plans. Neither can be proved to exist without its key.

When asked, hand over the porn keys. Plausible deniability.

7
switch007 1 hour ago 0 replies      
It makes me really angry seeing protests about laws which have already passed! It seems to be lazy journalism: Liberty et al. do the hard work while the bill passes through its parliamentary stages, and only once it's passed do traditional media and others pick up on it and start complaining.

Prevention is better than ranting after it's set in stone.

8
freehunter 9 hours ago 7 replies      
I have to wonder if this would ever hold up in court. I don't know much about the UK justice system, but in America it would be pretty rare to be convicted of a crime that they can't actually prove you committed. You could be jailed for refusing to comply with a court order to decrypt the file, but if you can prove it's not actually encrypted, they can't do anything about it.
9
jakeonthemove 9 hours ago 3 replies      
Damn, the UK is pretty f'ed up - the list of things that British citizens can't enjoy compared to a lot of other countries (even developing ones) is growing every day.

Meanwhile, a criminal could easily just store everything on an encrypted microSD card, then eat it if anything goes wrong - the oldest trick in the book still works in the digital age :-D...

10
MRonney 7 hours ago 0 replies      
I was watching 'Garrow's Law' yesterday. He said that "Laws which are passed in times of fear are rarely removed from the statute books". Terrorists always win, because every time they attempt to strike, the Government removes our basic liberties under the guise of protecting us.
11
jiggy2011 5 hours ago 1 reply      
Assuming this article is true (which I am pretty skeptical of; I live in the UK and never hear about people being jailed for not giving up an encryption key):

What would happen if there is encrypted data on your system but you didn't set the key yourself? For example, DRM systems usually work by encrypting data and trying their best to make sure you never acquire the key.

12
mistercow 8 hours ago 0 replies      
>Yes, this is where the hairs rise on our arms: if you have a recorded file with radio noise from the local telescope that you use for generation of random numbers, and the police asks you to produce the decryption key to show them the three documents inside the encrypted container that your radio noise looks like, you will be sent to jail for up to five years for your inability to produce the imagined documents.

Of course, if you have access to the files, you could just XOR the noise with some innocuous documents, and send the result to the police saying it's a one-time-pad.
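(mistercow's trick takes only a few lines; the file names are made up for illustration:)

```python
noise = open("radio_noise.bin", "rb").read()
innocuous = open("kitten_essay.txt", "rb").read()
assert len(innocuous) <= len(noise)

# Manufacture a "one-time pad" under which the noise decrypts to the
# innocuous text: pad = noise XOR text, hence noise XOR pad == text.
fake_pad = bytes(n ^ t for n, t in zip(noise, innocuous))
assert bytes(n ^ p for n, p in zip(noise, fake_pad)) == innocuous
```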

13
zaroth 3 hours ago 0 replies      
Can you say, "Who is John Galt?"

Eventually the preposterous laws drive those with mobility to simply leave. Follow that to its logical conclusion: the UK will make it difficult to impossible to leave with your assets intact. Loss of privacy is just a precursor to loss of private property altogether.

14
theaeolist 9 hours ago 2 replies      
Isn't TrueCrypt's 'hidden volume' feature enough to make this law pointless? Just have two encoded sets of information in the same file. When you are asked to give the key, it is up to you which one's key you give.

http://www.truecrypt.org/docs/?s=plausible-deniability

15
Albuca 9 hours ago 0 replies      
This reminds me of this American case:

http://www.wired.com/threatlevel/2012/02/forgotten-password/

On the whole, the article is scary and slightly unsettling. On the upside I don't live in the UK - but if we were to be traveling through the UK with our encrypted hard drives, would we be targeted by the law?

16
epo 6 hours ago 0 replies      
This article is paranoid, ill-informed speculation, as are many of the Brit-bashing comments. The police have to show a judge they have good grounds to believe you are concealing evidence from them. Note also that if the powers that be are really determined to stitch you up, they will simply plant data on you instead.
17
vy8vWJlco 7 hours ago 0 replies      
We have begun to outlaw privacy. This is wrong. Speak up, while you still have a voice.

http://archive.org/details/the_hangman_1964
https://www.youtube.com/watch?v=keZlextkcDI
https://en.wikipedia.org/wiki/The_Drumhead

18
baby 7 hours ago 0 replies      
A scary article that forgets many "stupid" or "vague" laws already exist and are either never used or only used in the right context.
19
ivanmilles 9 hours ago 1 reply      
So, now Random actually /is/ Resistance?
http://www.youtube.com/watch?v=aE6RtzwVdHI
20
Feoj 6 hours ago 0 replies      
How does/would this affect Freenet users? As far as I know, a Freenet user's 'deniability' claim comes from the idea that the user does not know the key to the encrypted content hosted on their machine.
21
antoinevg 9 hours ago 1 reply      
Roll on dual encryption. One key renders a dissertation on kittens, the other renders the original clear-text. Next problem?
22
short_circut 5 hours ago 0 replies      
So does this imply that I could go to prison for having an executable file presuming I can't "decrypt" it back into its original source code?
23
Zenst 10 hours ago 3 replies      
I stand by my argument that you can have an encryption key that is, say, 2000 characters long. Print it out one character per page and submit that in advance at your local police station, getting a receipt. You are then within the law.

Now the question is: compression can be viewed as encryption. How does that pan out if you use a non-standard form of compression that does not require a key, as the compression formula is itself the key?

24
adamt 9 hours ago 4 replies      
I don't like or support the legislation - but I think this is a bit of an over-reaction.

The law as I understand it says that if you've got data (and the law is focused primarily on targeting terrorism, child porn, etc.) that you've encrypted but refuse to hand over the encryption keys to, then if the police convince a judge that there is valuable evidence in the encrypted data, and you still refuse, you could ultimately go to prison.

Is this really any different to a digital search warrant?

Sure, this law, like many others, could be abused. But I don't see it as anything to get too wound up about.

P.S. What kind of person has a 32GB file of satellite noise to generate random numbers with?!

8
Betaworks to Pay $500,000 for Fallen Social Media Star Digg wsj.com
65 points by uptown  4 hours ago   43 comments top 13
1
marknutter 3 hours ago 2 replies      
This is a case of a company having delusions of grandeur. Instead of humming along with 10 or so employees raking in cash hand over fist, they tried to grow as fast and as large as possible to try to revolutionize news. Reddit, on the other hand, did the former and is stable and profitable; it never pretends to be something it's not.

Not all companies are going to change the world, nor should they. I see Facebook and Twitter making the same mistakes. Perhaps Facebook is a great place to keep in touch with friends but nothing more. Perhaps Twitter is simply a great replacement for blogging, nothing more. Instead of filling their niches and quietly making a small number of employees very wealthy, they try to become these massive institutions that are reliant on very fickle user bases. It's a house of cards.

2
rmason 1 hour ago 2 replies      
I mean no disrespect to Kevin Rose, but I've got to sit and reflect on the differences between Digg and Reddit now that we know who won. I remember one of the Reddit founders saying once not to focus on your competitors but to just put your shoulder down and go.

So I have to contrast this Business Week cover:

http://valleywag.com/assets/resources/2006/08/kevin-cover-bu...

To this early 2005 interview with the founders of Reddit. They were so humble ("it wasn't even our idea"), and they lived and breathed the site, even waking up every two hours to check it:

http://www.youtube.com/watch?v=5rZ8f3Bx6Po

Kudos to the Reddit founders but the real lesson is that their sacrifice built a better, stronger community.

3
programminggeek 4 hours ago 3 replies      
Remember when Digg raised $45 million and at one point Google was going to buy them for $200 million?

I guess sometimes you should just take the money and run.

4
brianlash 4 hours ago 3 replies      
That strikes me as shockingly low.

For context, KillerStartups bought the domain "startups.com" 4 years ago for $500,000. No IP, no brand recognition. The domain.

5
ajacksified 3 hours ago 1 reply      
Oh, WSJ, you never cease to make me giggle.

> outmaneuvered by rivals like Facebook Inc. FB -0.52% and Twitter Inc

Or, you know, Reddit, its direct competitor who mopped up much of Digg's userbase.

6
ojbyrne 3 hours ago 0 replies      
I'd make a vague guess that the reason the price is so low is that Betaworks assumed some debts.
7
Osiris 3 hours ago 2 replies      
I loved Digg when it was v1 and just a technology news site. They always had great content that kept me up to date on the newest tech articles. Then they started adding other news categories and the front page became a bunch of random news articles that mostly didn't interest me.

Even if you went to the "Technology" page, it would only show those articles that had made it to the front page, meaning there'd be less than 10 new articles on that page per day.

Then it just started becoming spam, with spammers using huge groups of users to upvote bad content like infographics.

I've visited it a few times in the past few months just to see what was going on and it's a really, really bad user experience right now. Really just awful.

8
hastur 3 hours ago 1 reply      
There's only one appropriate comment that sums up both this news item and the state of Digg in general:

Who cares.

9
kmfrk 3 hours ago 0 replies      
Color (.com) paid $350,000 for the domain name alone. Silicon Valley is funny like that.
10
hornokplease 3 hours ago 0 replies      
TechCrunch is now disputing the $500,000 price in their coverage[1]:

One source close to the negotiations tells us that the price was indeed not $500k, but we haven't been able to pinpoint an exact price yet.

[1] http://techcrunch.com/2012/07/12/betaworks-acquires-digg/

11
jmduke 4 hours ago 2 replies      
Digg, for all of its issues, still has 4M monthly uniques and an Alexa Rank of 191.

Compare this to 37signals' Sortfolio, which sold for $480,000 a few weeks back.

12
larrys 1 hour ago 0 replies      
"The price was only about $500,000, three people familiar with the matter said"

Shouldn't the headline have added the word "rumored"? Or maybe a Betteridge's law version: "Digg to sell for only $500,000?"

13
oliwarner 3 hours ago 0 replies      
This valuation is way, way, way off. I mean, add a zero to the end and I still don't think you're close to a fair value.

Nice domain, tons of traffic, and a load of real users who you could keep around if you wanted to. Even if you wanted to strip-mine the domain out and dump the company, I can't imagine your costs are going to be anything significant.

Whoever's managing this deal at Digg's end is screwing it right up.

9
You Can Get a Better Job If You Just Ask hash-money.com
58 points by nhashem  4 hours ago   14 comments top 8
1
DanielRibeiro 3 hours ago 0 replies      
Much of this advice echoes The Passionate Programmer[1]. However, I believe it should also echo this one[2]: Don't waste your time in crappy startup jobs.

Maybe more importantly, it should echo You Weren't Meant to Have a Boss[3].

An even more disruptive perspective actually came from Bret Victor[4]:

There are many ways to live your life. That's maybe the most important thing to realize in your life, that every aspect of your life is a choice. There are default choices. You can choose to sleepwalk through your life, and accept the path that is laid out for you. You can choose to accept the world as it is. But you don't have to. If there's something in the world that you feel is a wrong, and you have a vision for what a better world could be, you can find your guiding principle, and you can fight for a cause. So after this talk, I'd like you to take a little time, and think about what matters to you, what you believe in, and what you might fight for.

[1] http://pragprog.com/book/cfcar2/the-passionate-programmer

[2] http://michaelochurch.wordpress.com/2012/07/08/dont-waste-yo...

[3] http://www.paulgraham.com/boss.html

[4] http://gumption.typepad.com/blog/2012/03/principle-centered-...

2
jmduke 3 hours ago 1 reply      
Are you a software engineer at a company with a languishing flagship product where any attempt at innovation is prematurely killed in the name of optimizing quarterly profits?

Stuff like this bothers me and I think paints an unfair picture of a corporate environment. Corporations aren't some Disney dichotomy, either incredibly encouraging of change or supervillains who purposefully punish developers in order to line their own wallets.

The reality, I think, is that the vast majority of companies are in the middle. Change is good; change for the sake of change often goes against business principles. Why is a company shunning your 'innovation'? Is it because it's unfeasible, or costly? Does it not integrate well with the product line?

3
jere 2 hours ago 1 reply      
>Think of any reason why you're tolerating your current job and ask any prospective employer to beat it.

What about a great team, reasonable hours, lack of red tape, and flexibility in technical choices? All reasons I love my current job and things I don't think a potential employer would give an honest answer about anyway. That's the biggest friction, for me at least.

4
temphn 7 minutes ago 0 replies      
This is sort of reminiscent of those billboards proclaiming "Life is short. Get a divorce."

Go talk to some people who've made the leap to founding their own companies or going for that bigger position. The pressure increases and the race never ends. Sometimes it's nice to just be able to knock out code on interesting problems without constantly worrying about your next move. Engineer comp is going up, way up. I don't think the right next step is to turn engineers into hyperpolitical MBAs.

5
rheide 2 hours ago 0 replies      
It's just not true. Not for everyone. I've worked with people in the past who would shine in a new role, and I keep telling them to quit their job and move on to something better. But I've also worked with people who think they're good programmers but really aren't, and those people should thank the lord with both hands that they got the job they have now.
6
readme 2 hours ago 0 replies      
Alternate title: How to become delusional and get fired from your solid job that pays all your bills and allows you to save for retirement and invest, too.
7
Ralith 2 hours ago 0 replies      
This seems to be nothing more than a long-winded advertisement for Persway.me, which doesn't even serve areas other than NYC and LA.
8
xonev 3 hours ago 2 replies      
Why persway.me and not persuade.me?
10
I Believe in Gittip gittip.com
46 points by jordanmessina  4 hours ago   25 comments top 4
1
kiba 2 hours ago 1 reply      
About doing what you really want and getting an income: sometimes they don't mix well.

Right now, I am just building whatever projects I feel like. Some are, no doubt, useful to everybody else. However, the rest may be interesting only to me, or solve a problem that is unique to me.

Do I know where I will get my money? No clue. However, it has been a blast for me, personally. I've learned a few technologies, such as Meteor.js and how to develop Chrome extensions, although I really want greybeard knowledge (algorithms, debugging, and other core skills), not the latest fad.

2
chubot 1 hour ago 2 replies      
I like the idea of Gittip, and I have definitely thought about the problem of how society can encourage things that benefit it (e.g., open source software).

I had a bad experience with Gittip though. I tried to tip you. It said to sign in with GitHub, so I did. Then I went back and tried to tip you again, and it said there was an error. Then I clicked "back with a credit card", and it said I needed to sign in with GitHub to add a credit card, even though it already showed me signed in under my username in the upper right corner. So something is wrong.

Also I think it would be better to allow one-time payments too. I just wanted to try your system but I wasn't committed to making a recurring payment. I was going to cancel it after I tried it. I would have been fine with a one-time payment.

3
rpwilcox 2 hours ago 2 replies      
Eating your own dogfood: an excellent idea :)
4
wildmXranat 3 hours ago 1 reply      
So Gittip is a recurring donation platform for GitHub authors?
12
A 12pt Font Should Be The Same Size Everywhere github.com
60 points by kickingvegas  5 hours ago   39 comments top 13
1
crazygringo 4 hours ago 3 replies      
Well yes, 12pt should be the same everywhere, but points are a terrible unit of measurement for anything computer-related.

Points are useful on paper only. They should only exist in the context of word processing and the like, where we expect things to be printed at an actual physical size, with 72 points to an inch.

For computer interfaces or web documents, we just need some kind of measurement that is a relatively known proportion to the computer or browser's interface. Fortunately, CSS "px" does that rather well -- it doesn't indicate physical pixels, but rather logical pixels, and we all know what "16px" text looks like relative to our OS, and it works great. And happily, people rarely use "pt" in CSS, and in my opinion it should never have been an option in the first place.
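(The arithmetic behind this, as a sketch: a point is 1/72 inch, so the device-pixel size of 12pt text is purely a function of the display's pixel density; the PPI figures are examples:)

```python
def pt_to_device_px(pt: float, ppi: float) -> float:
    return pt / 72.0 * ppi  # 72 points per inch, ppi device pixels per inch

print(pt_to_device_px(12, 96))   # 16.0 px on a classic 96 PPI desktop
print(pt_to_device_px(12, 326))  # ~54.3 px on a 326 PPI phone screen
```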

2
gcb 4 hours ago 2 replies      
i was fighting with the same thing the other day... then i realized, i do not look at my 24" monitor at the same distance as i look at my 4" phone screen.

So, which size should the font be if i'm designing for both those screens? pt sure isn't the answer.

this being only tangentially related to the topic :)
so, back on topic, yes, i wholeheartedly agree that pt should mean what it means. it's just retarded that it's not. and you are not even accounting for TVs, which cut random portions of the displayed image for no reason, making it even harder to calculate the real dpi.

3
PaulHoule 4 hours ago 1 reply      
scary... my experience is that the more platforms try to get this kind of thing right, the more they get it wrong.

back in the 90's, photoshop would try to do all sorts of gamma correction on images you were editing with the consequence that, if you didn't turn it off and make sure it was always turned off, you'd get your colors wrong 100% of the time.

a system like that working requires that all the pieces be correctly configured, and the consequence is that instead of having something that's 4.1 inches on one platform and 5.2 on another, you have something that's 12.7 inches on one platform and 1.4 on another.

4
wmf 4 hours ago 2 replies      
You seem to be taking it as given that the size of stuff should match between different displays, but you don't give any reason why that's desirable. I understand the argument for 1:1 WYSIWYG in DTP, but not for UI. I haven't been doing much DTP in the last decade.

Some people prefer larger UI and some prefer smaller " partly due to differences in vision and partly just preference " and today's industry lets them choose. Your proposal takes away that choice and thus is guaranteed to anger people.

5
ChuckMcM 4 hours ago 1 reply      
It would be nice; it won't happen, of course, but here's hoping.

The challenge isn't typography, it's people. People want their document to be as wide as their phone on a phone, as wide as their tablet on a tablet, and not quite as wide as their screen on their desktop or laptop. If you force them to compute their 'zoom factor' they get annoyed. People who give them what they want get their business. And typography continues to suffer.

6
dpark 2 hours ago 0 replies      
This sounds like a good idea at first[1], but then you realize that text is often accompanied by graphics that are not vectorized. Resolution independence when coupled with raster images is a hard problem. You can render the text at the desired size fairly easily, but you can't resize a bitmap arbitrarily without it looking pretty terrible. This is why resolution independence hasn't happened for displays despite lots of attention. It's also why Apple just doubled everything to keep it simpler.

[1] Actually, it might not even seem like a great idea at first if your first thought is to contrast your phone screen and TV screen.

7
px1999 39 minutes ago 2 replies      
I don't think I see the point, aside from historical reasons and printing (though that one's already been solved quite thoroughly...).

The beauty of electronic displays is that they can be tailored to your specific wants and needs. If I want to view your text, I should be able to view it at a size that's comfortable to me on whatever device I want to read it on. If I'm not reading it, and am instead waiting for updates, I should be able to scale it down and put it somewhere in my workspace that doesn't take up too much room or distract me.

To have an edict that says 12pt must always be 0.4233cm or 0.1667in (quite a ridiculous measurement) achieves nothing. What if I want to display the resource you built for display at 12pt on a 24 inch monitor? A 6 inch phone? A projector? What if I have bad vision and want it larger? Suddenly, 12pt needs to get multiplied by some arbitrary factor and it's lost all of its usefulness again.

My expedient excuse is that it's not necessary unless you're dealing with something static and physical.

8
wnoise 3 hours ago 1 reply      
No. I want to be able to make slides that work on projectors. The only unit able to handle this would be "angle based on standard viewing distance", which will require markedly different "actual lengths" for different devices.
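(A sketch of the unit wnoise is describing, physical size pinned to visual angle at an assumed viewing distance; the distances are illustrative guesses:)

```python
import math

def physical_size_cm(angle_deg: float, viewing_distance_cm: float) -> float:
    """Size an object needs to subtend angle_deg at the given distance."""
    return 2 * viewing_distance_cm * math.tan(math.radians(angle_deg) / 2)

for device, dist_cm in [("phone", 30), ("monitor", 60), ("projector", 400)]:
    print(device, round(physical_size_cm(0.3, dist_cm), 2), "cm")
```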
9
drivebyacct2 3 hours ago 0 replies      
Huh? I sit at a different distance from my monitor than my phone is from my face. I buy high resolution monitors, not because I want the "OMG the circle is a smoother circle" sense of a Retina display but because I want greater density. If we're nitpicking about "pt" in particular, fine. But if we're saying we should design UI elements and text specifically so that it's always the same size, count me completely out.
10
teilo 1 hour ago 0 replies      
"Font-hinting becomes less necessary at 200 PPI and higher."

Umm, no. Font hinting is still important at 300 DPI, so it's also still important at 200 PPI.

11
brittohalloran 4 hours ago 1 reply      
GitHub repo as blog post?!?! How's that for some transparency in your edits:

https://github.com/kickingvegas/12pt-should-be-the-same-ever...

12
xmmx 4 hours ago 1 reply      
This will lead to more harm than good. Sure, designers can use it to their advantage to create great pages, but some people who don't account for this will make their websites so their 12mm font looks great on their 4000px monitor while everything looks fuzzy on my 1024px screen. And oh god, the scrolling when I try to use my iPod touch to browse the site.
13
homer-simpson 3 hours ago 0 replies      
For your reading pleasure, here is an (imho totally flawed) argument by one Mozilla guy who thinks otherwise:

http://robert.ocallahan.org/2010/01/css-absolute-length-unit...

13
Richard Posner: Why There Are Too Many Patents In America theatlantic.com
179 points by joshuahedlund  9 hours ago   62 comments top 11
1
grellas 3 hours ago 1 reply      
This piece amounts to a red alert signal from a distinguished judge to Congress that it needs to fix some pernicious elements of the U.S. patent system and that it needs to do so now. The tone is judicious but the message is essentially alarmist: the system is seriously out of whack and Congress needs to get on with fixing it.

Judge Posner admits he is no expert on what the fixes should be and his tentative suggestions for fixing the system are, in my view, decidedly mixed on their merits (e.g., specialized adjudications before the USPTO - remember when it was suggested that a specialized appeals court would improve the patent system and the result was a court that has been so maximalist in its approach to patents that it has in itself become a significant part of the problem).

So where to begin?

Legally, it has to go back to fundamentals and, for me, this has to go back to the scope of patentable subject matter and whether this should be defined to include software at all.

The Patent (and Copyright) Clause of the Constitution (Article I, sec. 8, cl. 8) provides that the Congress shall have the power "to promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries." Note that, in defining this as one of the enumerated powers of the federal legislative branch, the Constitution does not mandate that the legislature provide for patent protection of any sort. It merely permits the exercise of such a power within constitutionally prescribed limits. Thus, any legitimate exercise of patent authority in the U.S. must come from Congress and must respect the constitutional bounds that any grant of patents be for "limited times" and be done in such a way as "to promote the progress of science and useful arts." Legally, then, any patent system in the U.S., if adopted at all, must be authorized and defined by Congress with a view to promoting the progress of science and, implicitly, must employ "limited times" consistent with what it takes to promote scientific progress.

The first issue, then, is whether patents are needed at all to promote the progress of science. In the U.S., in spite of philosophical arguments to the contrary by Jefferson (http://news.ycombinator.com/item?id=1171754), this has never been seriously in dispute. The industrial revolution was already well in progress in 1789, when the Constitution was adopted, and the federal authority, though generally regarded with great wariness at the time, was seen as vital to protect the rights of inventors and to reward them with limited monopoly grants in order to encourage the progress of science. In the first U.S. Patent Act (Act of April 10, 1790, 1 Stat. 109, 110), Congress implemented its constitutional authority to sanction patent monopolies by defining patentable subject matter very broadly, to include "any useful art, manufacture, engine, machine, or device, or any improvement therein." Congress amended the Act in 1793 and then again in 1952, so that today it reads as to the idea of "patentable subject matter" as follows (35 U.S.C. sec. 101): "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Thus, patents in the U.S. can be granted for any original invention that fits within the definition of patentable subject matter and that also meets the other conditions of the patent act (i.e., that is useful and non-obvious). Note, though, that the 1952 definition of patentable subject matter significantly expanded the scope of such subject matter in the name of bringing the patent laws up to date with developments in then-modern technology, all in the name of promoting the progress of science. It did so by defining patentable subject matter to include any "new and useful process" as well as any "new and useful improvement" of any original invention. Over time, "process" has come to embrace business methods and also software. And the protection of "useful improvements" made clear that new uses of existing machines or processes could be patented notwithstanding older Supreme Court decisions such as Roberts v. Ryer, 91 U.S. 150, 157 (1875) ("it is no new invention to use an old machine for a new purpose").

To promote the progress of science, then, Congress in 1952 allowed patents to be granted for any inventive process and for any inventive new use for any such process. In my view, this generally made sense for what was essentially the continued playing out of the same sort of industrial revolution that animated the original forms of patent protection granted in 1790. Looking at that language at that time, one could readily make the case that patentable processes and improvements thereon could and did promote the progress of science. Discrete inventions tended to be sharply differentiated and tended to involve significant development effort in time and resources. An inventor could keep a process secret and not patent it but the grant of a limited monopoly gave a decided inducement to disclose it to the world and, hence, to expand the broad pool of scientific know-how available to society.

Then came the digital revolution and, with software, a new or improved process can amount to an almost trivial variation on prior art amidst a seemingly endless stream of improvements developed in ever-rapid succession and with little or no capital investment beyond what developers would be motivated to do for reasons entirely independent of gaining monopoly protection for the fruits of their efforts. Moreover, there is little that is closed about such innovations: a wide knowledge base is in place, known to an international developer community that is basically scratching its collective head asking why it should be restricted legally from using techniques and processes that represent common knowledge in the field.

The main question, then, concerning software patents, is whether the existing framework makes sense as one that promotes the "progress of science" insofar as it grants patent protection to process inventions in this area. Congress needs to seriously ask itself that question. A second question, also tied to constitutional authority and assuming that it is legitimate to grant some form of patent for such inventions, is whether a 20-year period of exclusivity makes sense in an area where innovation occurs at blazing speeds and with not too much capital investment tied specifically to any given discrete invention. Is that necessary to promote the progress of science? That too is a question that Congress needs to consider.

Thus: (1) there is nothing magical about the current definition of patentable subject matter and Congress can adapt this to suit the needs of the time in promoting the progress of science, (2) process patents are in themselves a fairly recent phenomenon (at least in any large numbers) and it is no radical change to curtail them in areas where they make little or no sense in light of the constitutional purpose for why patents even exist in the first place, and (3) legitimate patent reform needs to go far beyond procedural fixes around the edges of the system and needs to focus on the realities of modern technology and whether the patent laws further or impede the progress of science as applied.

The policy debate can and will go all over the board on this but, if it is framed in light of the constitutional foundation for having patents in the first place, it can be shaped in a way that puts the focus on the fundamentals of what needs to be fixed as opposed to lesser issues that do not get to the heart of the problem. The main problem today is the blizzard of vague and often useless patents in the area of software. These are effectively choking all sorts of innovation and are benefiting mainly lawyers, trolls, and others who do not further technological development by what they do. It is a mistake, in my view, then, to swing too broadly in trying to fix things (as by advocating abolition of all patents) or to be so timid about the issues that reform is marginal at best and ineffective in dealing with the current crisis of an over-abundance of essentially worthless patents. Congress embraces the patent system as a whole and shows no hostility to its fundamentals. Reform must be shaped in light of those fundamentals but it must, at the same time, be meaningful to eliminate the main garbage from the current flawed system. Judge Posner has pointed the way generally and proponents of reform ought to follow his lead, with the focus being (in my view) on software.

2
cletus 9 hours ago  replies      
Posner may well be one of the most important figures in tech in the coming decade for standing up against the lunacy of software patents. What Congress and the President don't seem to understand is that the cost of patent litigation in the US poses an existential threat to America's dominant position in tech.

One of the most compelling arguments to me (against these patents) is that in pharmaceuticals, for example, you are dealing with a handful of patents. Some processes might be patented, maybe even some equipment (easily licensed generally) but basically the patents that go into a process (that may itself be patented) are minimal and can be reasonably well understood by those running such businesses.

Posner pointed out that a smartphone may well contain (and violate) thousands of patents. That right there is a sure sign that something is rotten in the state of the patent system.

The solution here isn't reform, as some suggest (ie raising the bar to what's patentable). It's simply to get rid of them. First-to-market and execution are what matters and what should matter. 20 year exclusives for vaguely worded patents on things that are more often than not obvious is just a means for big companies to extinguish smaller companies.

3
mtgx 9 hours ago 3 replies      
"In most [industries], the cost of invention is low; or just being first confers a durable competitive advantage because consumers associate the inventing company's brand name with the product itself; or just being first gives the first company in the market a head start in reducing its costs as it becomes more experienced at producing and marketing the product; or the product will be superseded soon anyway, so there's no point to a patent monopoly that will last 20 years; or some or all of these factors are present. Most industries could get along fine without patent protection."

Wow, this guy really gets it. This is how markets and competition work. There's no need to give a company a legal monopoly. If anything, that lack of monopoly will force companies to keep inventing new things to stay one step ahead of their competitors.

I also love this one:

"forbidding patent trolling by requiring the patentee to produce the patented invention within a specified period, or lose the patent"

These days big tech corporations are filing patents as fast as they can print them on paper. And then 95% of them will probably never be used in products that are shipping in the market.

4
jpadkins 8 hours ago 1 reply      
I used to be like Richard Posner: generally against patents except for a few cases like pharmaceuticals. That was until I read Against Intellectual Monopoly: http://levine.sscnet.ucla.edu/general/intellectual/againstne...

See chapter 9 for an historical analysis of the pharmaceutical industry in countries without patents. The surprising result is that companies in countries without patent protection were producing new drugs equivalent to those of the patent-protected companies.

Now I am a full anti-IP advocate, except for certain trademarks and attribution of authorship (so people know which company/author a product came from).

5
WalterBright 5 hours ago 1 reply      
Before 1989 or so, software was assumed to be not patentable. This did not appear to slow down innovation or progress in software in the slightest.
6
vibrunazo 4 hours ago 0 replies      
I'm extremely skeptical of the proposed solutions. I haven't yet seen a solution that would be a clear net win for society after summing the pros vs. cons. It seems to me that fixing patents is a mathematical impossibility: trying to come up with a system that forces most inventors to pay a few inventors, while at the same time not punishing most inventors, sounds like trying to come up with a number that is less than 2 while at the same time greater than 1000. It's mathematically impossible.

The optimist in me would love to believe there's a brilliant solution that is way over my head. The realist in me can only see paradoxes and no obvious solution. Maybe I'm just too dumb to solve this problem myself.

I believe the right path is to look back at what the vision behind patents is in the first place (incentives for invention), and think from the ground up about how we can implement this without the modern "necessary" dogmas (such as licensing or IP). Then I can actually think of plenty of solutions. But none of them even remotely resembles what we know today as a patent.

7
guygurari 8 hours ago 0 replies      
"There are a variety of measures that could be taken to alleviate the problems I've described. They include: reducing the patent term for inventors in industries that do not have the peculiar characteristics of pharmaceuticals that I described; ..."

To me the obvious solution, and the one missing from this list, is to abolish patents altogether in such industries (including the tech industry). I wonder if judge Posner would agree, and if so, why not come out and say it? Would this be considered too radical at this point in time?

8
kiba 8 hours ago 1 reply      
Whenever a congressman or members of the executive branch do something, I usually hate their guts.

Whenever a judge decides something, it usually makes me like them.

In fact, Americans trust their judges more than their politicians and bureaucrats. http://www.gallup.com/poll/143225/trust-legislative-branch-f...

9
clarle 8 hours ago 0 replies      
Great points all around, but I don't necessarily agree 100% with his thoughts on the pharmaceutical industry.

For specific drugs, this may be the case, but when you have pharmaceutical companies doing things like patenting specific gene sequences, causing both other companies and academics to have to get licenses/permission just to perform research on something completely different, that's just ridiculous.

How patents work should be more flexible, and not limited to just whatever industry they're in.

10
keithpeter 7 hours ago 1 reply      
"...eliminating court trials including jury trials in patent cases by expanding the authority and procedures of the Patent and Trademark Office to make it the trier of patent cases, subject to limited appellate review in the courts..."

UK perspective: what do people here think of this suggestion, perhaps even as a temporary 'damper' on the patent troll business model? Raising the barrier to litigation would perhaps slow down the rate at which these cases occur. In the UK, we have a special court for trying IP cases, and the barrier to litigation is very high, perhaps too high for some small companies. Of course, the EU does not allow the granting of software patents.

11
Zenst 9 hours ago 5 replies      
What I don't understand, and also feel is an issue with patents as a whole, is that you can patent something without actually being able to show a working example/product.

For example: somebody could patent teleportation, define it, and then, when somebody else does all the hard work and actually invents a teleporter, cry patent violation and cash in. That to me is completely wrong, yet that is how the patent system stands currently.

I have also noted that a lot of patents that have no working prototype or product seem to have been done in some sci-fi movie/TV series previously, and I find it somewhat surprising that the movie industry has not started jumping on this patent bandwagon, as they have more of a working prototype than many patents that get approved in this day and age.

14
"Digg Overtakes Facebook with 1400% Growth, 22.6 Million Uniques" dcurt.is
20 points by hodgesmr  1 hour ago   13 comments top 10
1
vlad 7 minutes ago 0 replies      
Very misleading quote. The full article features disbelief about Compete.com numbers, so it seems as much a critique of Compete.com as it does a story about Digg.

Having read the entire article, I wonder if Compete counted Digg buttons on blogs as visits. Facebook, of course, did not even get a Share/Like button until 2.5 years later in late 2009. http://mashable.com/2009/10/26/facebook-share-buttons/

(It's interesting that recent articles have debated if Facebook itself now counts the display or click of a Like button as a visit.)

2
therealarmen 31 minutes ago 0 replies      
It's a fun exercise, but we all know how reliable Compete is for quantitative traffic comparisons. For example, Compete currently claims Reddit (http://siteanalytics.compete.com/reddit.com/) has less traffic than Digg (http://siteanalytics.compete.com/digg.com/).

I sincerely doubt Digg surpassed Facebook's traffic at any point in the company's lifetime.

3
jere 1 minute ago 0 replies      
Is anyone else hoping to read the same article in 5 years but about Facebook?
4
carsongross 36 minutes ago 0 replies      
It's a crazy world out there, kids:

http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1850428

Try not to let it bother you too much.

5
dredmorbius 46 minutes ago 0 replies      
Sic transit gloria lacinia felis.
6
dennisgorelik 40 minutes ago 0 replies      
More recent data shows a nosedive:

http://siteanalytics.compete.com/digg.com/

7
marcamillion 51 minutes ago 0 replies      
In the blink of an eye, in internet years.
8
sukuriant 47 minutes ago 2 replies      
Fascinating to see Digg coming back from the grave, so to speak. I didn't see this coming.

Er...
Fascinating to see such an old article come up on Hacker News without the age of the article in parentheses.

That's a derp.

9
rdg 42 minutes ago 0 replies      
Really? In which parallel universe?
10
VictorZ 49 minutes ago 0 replies      
Go ahead. Rub it in.
16
Microsoft Officially Launches Its New Angel Fund And Incubator Program techcrunch.com
37 points by aaronbrethorst  4 hours ago   8 comments top 4
1
jf 12 minutes ago 0 replies      
The title of this post isn't exactly correct. From what I can tell, this is a program being run by one division of Microsoft (Bing) and not necessarily representative of Microsoft, the company.

(During the 90's people called Microsoft "the Borg", which does a disservice to how the company actually operates. Microsoft is actually more like the Ferengi, which is to say, an alliance of individuals obsessed with profit and trade who are known for their business acumen.)

2
Zenst 1 hour ago 2 replies      
To me it seems more like a very cunning way to market their own products. Why dish out X amount on marketing when you can borg a startup into using your products?

Still, it's an avenue that to many is a Faustian deal, but it is also an option that many others will appreciate - even if it does seem like a way to promote Bing and Azure.

3
prayag 1 hour ago 1 reply      
> While we can't make any guarantees, acquisition is always a possibility.

I am not sure why this was made explicit. People shouldn't join their fund to get acquired by MS; they should aspire to build a long-standing business.

4
vyrotek 1 hour ago 0 replies      
I thought this was interesting. Apparently you don't need to move to the Seattle area.

https://www.bingfund.com/About

What if my company is not located in the Seattle area?

While we will be able to give more hands-on assistance to startups that are nearby, as long as your company is incorporated in the United States, you will get the same benefits, with the exception of co-workspace.

17
Vim, you complete me thoughtbot.com
137 points by Croaky  10 hours ago   76 comments top 13
1
Rudism 9 hours ago 6 replies      
It all started out innocently enough. You experimented with it once or twice in your first year of college, but Nano and Pico were easier - closer to what you had already been using during high school on the Windows machines and Macs. But as time went on and you got more experience under your belt in the college-level computer science courses, you started to notice something: all of the really great programmers - the kind who churned out 4-line solutions for an assignment that took you 10 pages of code to complete; the kind who produced ridiculously over-featured class projects in a day while you struggled with just the basics for weeks - none of them used Nano or Pico.

Staying late one night to finish an assignment that was due at midnight, you happened to catch a glimpse over one of the quiet uber-programmers' shoulders. Your eyes twinkled from the glow of rows upon rows of monitors in the darkened computer lab as you witnessed in awe the impossible patterns of code and text manipulation that flashed across the screen.

"How did you do that?" you asked, incredulous.

The pithy, monosyllabic answer uttered in response changed your life forever: "Vim."

At first you were frustrated a lot, and far less productive. Your browser history was essentially a full index to the online Vim documentation; your Nano and Pico-using friends thought you were insane; your Emacs using friends begged you to change your mind; you paid actual money for a laminated copy of a Vim cheat sheet for easy reference. Even after weeks of training, you still kept reaching for your mouse out of habit, then stopped with the realization that you'll have to hit the web yet again to learn the proper way to perform some mundane task that you never even had to think about before.

But as time went on, you struggled less and less. You aren't sure when it happened, but Vim stopped being a hindrance. Instead, it became something greater than you had anticipated. It wasn't a mere text editor with keyboard shortcuts anymore - it had become an extension of your body. Nay, an extension of your very essence as a programmer.

Editing source code alone now seemed an insufficient usage of Vim. You installed it on all of your machines at home and used it to write everything from emails to English papers. You installed a portable version along with a fine-tuned personalized .vimrc file onto a flash drive so that you could have Vim with you everywhere you went, keeping you company, comforting you, making you feel like you had a little piece of home in your pocket no matter where you were.

Vim entered every part of your online life. Unhappy with the meager offerings of ViewSourceWith, you quickly graduated to Vimperator, and then again to Pentadactyl. You used to just surf the web. Now you are the web. When you decided to write an iPhone application, the first thing you did was change XCode's default editor to MacVim. When you got a job working with .NET code, you immediately purchased a copy of ViEmu for Visual Studio (not satisfied with the offerings of its free cousin, VsVim).

Late one night, as you slaved away over your keyboard at your cubicle, working diligently to complete a project that was due the next morning, you laughed to yourself because you knew no ordinary programmer could complete the task at hand before the deadline. You recorded macros, you moved entire blocks of code with the flick of a finger, you filled dozens of registers, and you rewrote and refactored entire components without even glancing at your mouse. That's when you noticed the reflection in your monitor. A wide-eyed coworker looking over your shoulder. You paused briefly, to let him know that you were aware of his presence.

"How did you do that?" he asked, his voice filled with awe.

You smile, and prepare to utter the single word that changed your life. The word that, should your colleague choose to pursue it, will lead him down the same rabbit hole to a universe filled with infinite combinations of infinite possibilities to produce a form of hyper-efficiency previously attainable only in his wildest of dreams. He reminds you of yourself, standing in that darkened computer lab all those years ago, and you feel a tinge of excitement for him as you form the word.

"Vim."

:wq

2
cletus 6 hours ago 5 replies      
Vim reminds me of an example from Bret Victor's excellent talk Inventing on Principle [1] (which you should watch from beginning to end if you've somehow been hiding under a rock and have missed it).

Bret mentions Larry Tesler (starting at about 38:10) who made it his personal mission to eliminate modes from software.

This is the problem I've always had with Vim, and I suspect I'm not alone in this. I find the concept of modes jarring, even antiquated. Everyone who has used vi(m) has pasted text while in command mode and done who knows what.

Emacs is better in this regard but I find Emacs's need for consecutive key presses (or a key press followed by a command) to be long-winded.

The advantage of either is they're easy to use over ssh+screen (or tmux) for resuming sessions.

That all being said, give me a functional IDE any day. IDEs understand the language syntax. Vim can do a reasonable job of this. Emacs (with elisp) seems to do a better job (or so it appears; I'm no expert) but IDEs (my personal favourite being IntelliJ) just make everything easier. Things as simple as left-clicking on a method and going to its definition.

For statically typed languages (eg Java/C#), IDEs (quite rightly) rule supreme. Static code analysis, auto-completion, etc are just so much better than text-based editors.

Dynamic languages are more of a mixed bag and it's certainly the norm for, say, Python and Ruby programmers to use one of these.

Still, give me an IDE any day.

One objection seems to be that people don't like using the mouse. I tend to think the speed gains from not using the mouse are largely illusory.

Anyway, I can use vim, but I've never felt comfortable in it and I don't think I ever will; modes are the primary reason.

EDIT: I realize there are plugins and workarounds for many of these things but that's kinda the point: I don't want to spend hours/days/weeks/years messing with my config to get it "just right".

Also, instead of archaic commands to, say, find matching opening/closing braces (which tend to involve an imperfect understanding of the language), IntelliJ just highlights matching braces/parentheses, where a variable is used and so on.

[1]: http://www.youtube.com/watch?v=PUv66718DII

3
solutionyogi 9 hours ago 3 replies      
I have been using Vim for over 3 years and I didn't know about the whole line completion.

I think the best thing about Vim is that it will always keep surprising you. If only we could find a life partner like that, there would be no divorces.

4
calinet6 7 hours ago 1 reply      
I understand Vim. Really, I do. I even know how to use it for the most part.

But I guess I just don't get it. It's too obtuse. I don't feel connected to my editing while using it, I feel... connected to Vim. Which, I think, might explain why others feel so connected to Vim too. They get attached to it because it's an investment, and as we know from various psych studies, we get attached to things we invest in.

This isn't a bad thing, just wanted to throw my 2¢ out there. Vim's a cool language for text editing, but it's not the only one.

5
pdeuchler 7 hours ago 7 replies      
This may just be me, but I feel that a lot of the productivity that Vim supposedly creates is a result of programmers spending lots of time learning one tool very well, as opposed to something inherent within Vim that automatically makes you more productive.

That's not to say Vim doesn't have an excellent ecosystem and tried-and-true ergonomic benefits; however, I feel that if someone spent the same amount of time learning, say, TextMate (just an example, may not be the best), they would be just as productive.

6
JangoSteve 5 hours ago 1 reply      
> For example, in HTML vim should know that you can't nest a div inside an a...

In HTML5, you can. See http://html5doctor.com/block-level-links-in-html-5/

7
jes5199 6 hours ago 1 reply      
There's something weird about the ^P/^N menu - I always have trouble telling which possible completion is selected in the list, and I often end up picking the wrong one.
8
ChuckMcM 2 hours ago 0 replies      
We never saw poetry written about TECO.
9
sirdavidoff 4 hours ago 0 replies      
Moving to vim from TextMate, I found that I missed TextMate's autocompletion.

In Vim you have to specify whether the word you're looking for is above or below the cursor (using C-P or C-N). IIRC, TextMate just looks for the closest matching word, no matter where it is.

I wrote a little plugin to make it do exactly that:

https://github.com/sirdavidoff/vim-nearest-complete

Disclaimer: totally rough and not (yet) very customisable

10
riannucci 8 hours ago 0 replies      
Also, SuperTab is all of the win: https://github.com/ervandew/supertab/

It'll let you use <Tab> for all the completion types, and you can tell it the fallback order for various things (i.e. try omni, then <C-N>, etc.). It's even slightly context aware, so it can guess a good first completion method for you.

All with <Tab> :)

11
va_coder 3 hours ago 0 replies      
I didn't know this one: ^X ^L
12
mun2mun 8 hours ago 0 replies      
Also check out neocomplcache: https://github.com/Shougo/neocomplcache
13
gubatron 3 hours ago 0 replies      
"Fuck VIM" -Richard Stallman
19
My Senior Design Project: Node.js WiFi-Extending Robot glench.com
92 points by Glench  8 hours ago   26 comments top 8
1
krschultz 7 hours ago 2 replies      
Why JavaScript? This is not a flame; I'm sincerely interested.

I have made several robots from small hobby projects up to an autonomous driving car (college team). For all of those we used C or C++, with one exception that used Java. For the Java controlled robot I built a web interface that allowed you to control it remotely over wifi (similar to this, without the repeaters).

I have been thinking about using Python or Ruby for the high-level logic while using C/C++ down low controlling motors/sensors. Or maybe running a JVM and doing the high-level logic in Clojure or Scala with JNI over the drivers.

Either way, I see why you want a high level language for the logic, but why JavaScript/Node.JS? I guess this is more remote-controlled rather than autonomous, but for autonomous systems I don't see what JavaScript brings to the table.

2
jauer 7 hours ago 2 replies      
"Unfortunately, cheap, off-the-shelf WiFi repeater units are not available and the network configuration for the devices we did find was extremely difficult, so we decided to use consumer-grade Linksys routers loaded with the open-source dd-wrt firmware."

Did you look at Ubiquiti devices? Their PicoStation would be perfect for this.

Also, and this depends on what you consider cheap, I would have used 900MHz for the connections between repeaters, base station, and robot since it will go further through rubble. Use HWMP for mesh across the repeater links & use 2.4GHz "access" at each repeater for access/interop with other devices. This could be done for under $500 per repeater with Mikrotik router boards and Ubiquiti mPCI cards in 900MHz & 2.4GHz.

3
mey 7 hours ago 3 replies      
Time to go into disaster and bomb disposal robotics by the looks of your work.

http://spectrum.ieee.org/automaton/robotics/industrial-robot...

Very nice build

4
alberich 7 hours ago 1 reply      
Not to be picky, but why is it called a robot if it is remotely operated? Isn't a robot supposed to be autonomous, at least in part?

I confess that I just gave a quick look at the article, but it seems like a small remotely controlled radio tank. Is it possible to give the thing instructions on what to do (e.g. follow a path along some infrastructure while taking pictures of it) and then let it do the job on its own?

5
michaelbuckbee 7 hours ago 0 replies      
While the whole project is rather interesting, the most immediately applicable item might actually be the WiFi repeaters in a ruggedized box.
6
dhruvbird 7 hours ago 0 replies      
Love the idea of dropping repeaters and back-tracking if the wi-fi connection is lost. I guess you could also extend it to remember where the repeaters were dropped and try to navigate to one of those spots if the backtracking fails (imagine debris getting in the way after the robot has moved ahead, making backtracking impossible).
7
wildermuthn 7 hours ago 2 replies      
I'm eagerly awaiting the armies of Mechanical-Turked WiFi robo-mops and robo-vacuums. Why hire real people to clean your buildings when you can get some Indian to control twenty cleaning bots at once?

Let me know when one of you hackers starts it up.

8
kpi 4 hours ago 0 replies      
Please make it fly.
20
Filepicker.io (YC S12) launches SDK for iOS and Android techcrunch.com
51 points by tagx  6 hours ago   10 comments top 7
1
joshma 5 hours ago 0 replies      
It's nice to see this on iOS - ironically, the future of iOS app interoperability might just be in the web (connected with services like Filepicker) instead of waiting for Apple to go native.
2
jaylib 5 hours ago 0 replies      
There is a similar MIT-licensed library on GitHub from DZEN Interaktiv: https://github.com/DZen-Interaktiv/DZDocumentsPickerControll...
It's still a work in progress, and only Dropbox and Cloud App are usable.
3
tagx 5 hours ago 0 replies      
If you guys want to try out sample apps on Android, here are the links to the apks:

An ImageViewer: https://github.com/Filepicker/filepicker-imageviewer/raw/mas...

An ImageSaver: https://github.com/Filepicker/filepicker-imagesaver/raw/mast...

4
sgrytoyr 4 hours ago 0 replies      
We've been testing Filepicker.io at Bugly almost from day one, and using it officially for a couple of weeks now, and it's been very solid. It's always a little frightening to rely on an external service for important functionality, but this is a pretty big deal for our users, so in this case we decided to take a chance and become early adopters. No regrets so far.
5
MIT_Hacker 6 hours ago 1 reply      
I love this! I can't wait to integrate this into one of my apps. Does it also have access to the native camera application?
6
makeee 4 hours ago 0 replies      
Nice work guys! Can't wait to use Filepicker in my iPhone app.
7
kmax12 4 hours ago 0 replies      
Great play on Filepicker's part. Having used their service on the web, it's a no brainer to include their service in future mobile apps.
21
ODROID-X is like a quad-core Raspberry Pi for $129 geek.com
47 points by ukdm  6 hours ago   35 comments top 12
1
drivebyacct2 4 hours ago 2 replies      
Not to be a sour-puss but there are a large number of these boards available now. It's exciting, but there really are quite a few of them. I think XBMC may be entering an even newer phase of awesome with the ubiquity of these.

edit: How does one find out which video codecs are supported/accelerated by the Mali 400?

Also, wow, that's a cheap wifi module.

2
outworlder 1 hour ago 0 replies      
It is nice that there are more development boards cropping up. But in this case, it is not as big a deal as the Raspberry Pi is.

What is a big deal is the price of their LCD modules. They are very cheap compared to what most development kits offer. It is strange that in this 'Retina' world, there are still crappy LCDs being sold for more than $200 (assembled monitors, no matter how small, costing many times that amount).

Granted, these LCDs only seem to work with the ODROID-X. Anyone know of a decent LCD module that will work with a Raspberry Pi and doesn't require sacrificing your firstborn? That's for mobile applications (like a carputer); otherwise I'd just hook up an old monitor.

3
nl 1 hour ago 1 reply      
Funny how this gets a great reception, but people think the Ouya will fail.

The Ouya is roughly the same power as this, $30 less, in a nice case with an out-of-the-box consumer experience.

4
ChuckMcM 3 hours ago 2 replies      
Nice, looks like a fun little box. Although it's another data point in the 'how-the-heck-is-ouya-going-to-cost-99-dollars' question. I wonder if Samsung is more forthcoming on how to program the GPU than either Broadcom (RPi) or TI (Panda/Beagle) are.
5
yuvadam 4 hours ago 1 reply      
Hmm, no obligatory info on how the "quad-core" affects power consumption vs. the Raspberry Pi.
6
yardie 4 hours ago 3 replies      
Yes, just like a Raspberry Pi except without GPIO headers, and 4X the price.

A great nettop PC, but a Raspberry Pi it is not.

7
joshu 2 hours ago 1 reply      
Are there any ARM boards that are supported by Ubuntu/Debian/etc. out of the box?

I bought a couple of boards and they all run their own minor variants.

8
rektide 58 minutes ago 0 replies      
Four times the cores, four times the cost!
9
stevenleeg 4 hours ago 2 replies      
I like where we're headed with these sorts of devices coming out. There appears to be some serious demand for powerful and tiny computing devices that are cheap enough to almost be considered disposable. This makes them perfect for weekend projects that require a little more oomph than a typical Arduino can handle (and in some cases, like the Raspberry Pi's, they can even be cheaper).

I'm excited to see what the future brings in this nook of the technology/hobbyist industry.

10
gcb 4 hours ago 0 replies      
Put that into an MS Natural keyboard, with an LCD screen in the center, call it Fuchi Cyberdeck I, and sell it to me for 10x more!
11
nivertech 4 hours ago 2 replies      
Does the ODROID-X's ARM Mali GPU support OpenCL?
12
duaneb 4 hours ago 2 replies      
Is there no link or am I just blind?
23
Automatic Remote Image Loading In an iOS Table View Controller parse.com
22 points by jamesjyu  3 hours ago   11 comments top 3
1
sunsu 2 hours ago 1 reply      
I suggested this feature on the last HN article, when they announced Parse-powered table views. No one seemed to think it was a good idea then, but I'm very glad Parse decided to implement it! Reference: https://news.ycombinator.com/item?id=3401887
2
jazzychad 2 hours ago 1 reply      
Does this load images as the table is scrolling? Or only loads onscreen images when the scrolling stops? Do I have any control over which behavior I want?
3
thathoo 2 hours ago 2 replies      
How is this better than SDWebImage or AFNetworking libraries? Any quick comparisons?
24
What PHP 5.5 might look like github.com
140 points by nikic  12 hours ago   109 comments top 21
1
Smerity 12 hours ago 7 replies      
The introduction of list comprehensions is nice and should replace numerous functions. For example, I'm not sure why they're adding the array_column function into PHP 5.5 at the same time as list comprehensions as:

  $names = array_column($users, 'name');
// is the same as
$names = [foreach ($users as $user) yield $user['name']];

The only possible reason would be a significant speed difference, but I'd suggest improving the efficiency of the list comprehension system (even if just for common cases) rather than adding more functions.

As opposed to the introduction of list comprehensions, I can't say I like the solution for parameter skipping though... I disagree with the author -- many optional arguments are not a problem, but only if keyword arguments or a similar style are used. Consider the example they give:

  // This create query function in PHP
create_query("deleted=0", "name", default, default, false);
// could be represented like this with keyword arguments
create_query("deleted=0", "name", escape_arguments=false)

I tend to find keyword arguments serve the purpose of self-documentation as well -- I still have no clue what the false flag would be for the PHP create_query statement.

Keyword arguments and many optional arguments can allow for beautiful and flexible functions, such as the Python Requests library[1].

  r = requests.get('https://api.github.com/xyz')
# or we could add a few more complications
# -- no "defaults" in sight and also self documenting
r = requests.get('https://api.github.com/xyz',
auth=('user', 'pass'), timeout=0.5, allow_redirects=True)

[1]: http://docs.python-requests.org/

2
gurkendoktor 9 hours ago 2 replies      
> PHP 5.5 will no longer support Windows XP and 2003. Those systems are around a decade old, so PHP is pulling the plug on them.

Given the recent threads about planned obsolescence (Apple) and OS fragmentation (Android) I'm curious what is missing from XP/2003 that is part of Vista+. Are there still exciting things happening in the world of OS APIs that are relevant for server software?

Or does this just mean that no one will actively test the binaries on XP anymore?

3
geon 10 hours ago 0 replies      
I was confused by the password hashing example since it doesn't appear to use a salt (which was the main reason to implement it at all). Even more confusing, the RFC says it will auto-generate a salt if none is passed.

The explanation, a bit further down in the RFC, is that the function password_hash doesn't actually return a bare password hash, but a string including the algorithm and options used, the salt, and the hash itself.

From the RFC:

> It's important to note that the output of crypt() (and hence password_hash()) contains all the information that will be needed to verify the hash later. Therefore, if the default hashing algorithm changes, or the user changes their algorithm, old hashed passwords would still continue to function and will be validated properly.
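
In practice, usage per the RFC looks roughly like this (a minimal sketch; the function and constant names are taken from the RFC and could still change before release):

    // hash with the default algorithm (currently bcrypt) and an auto-generated salt
    $hash = password_hash($password, PASSWORD_DEFAULT);

    // later: verify against the stored string, which embeds the algorithm,
    // its options and the salt alongside the hash itself
    if (password_verify($password, $hash)) {
        // password is correct
    }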

4
bergie 10 hours ago 1 reply      
These sound like useful improvements, especially the new getter/setter syntax and scalar type hinting.

However, what I'd really like to see in next PHP would be the Composer dependency manager bundled in, a bit like Node.js nowadays ships with NPM. This has really brought a new world of cross-project code sharing into PHP:

http://packagist.org/

I wrote a blog post on why this is important for improving the state of PHP:

http://bergie.iki.fi/blog/composer_solves_the_php_code-shari...

5
pbiggar 9 hours ago 1 reply      
I was involved with PHP internals when type hinting came up for 5.3, and helped steer that discussion in some way [1]. It was a truly awful experience - at the time, the internals community was a very negative and poisonous place. Does anyone still involved know if that has been fixed?

[1] oh look, I'm still mentioned in the RFC :) - https://wiki.php.net/rfc/scalar_type_hinting_with_cast

6
lysol 11 hours ago 2 replies      
This is just a "would be nice" sort of post, right? The main feature I'd want (list comprehensions) seems to be just a single programmer's guess at an upcoming feature, with no real evidence.
7
debacle 12 hours ago 3 replies      
Pretty exciting, though empty() should be deprecated, not improved upon.

The bit about getting the fully qualified class name is important, but it still prevents you from doing something like:

  function builder_factory($var, $some_array) {
      // look up the class name, then instantiate it dynamically
      $class = $some_array[$var];
      return new $class();
  }

So you have to write a ton of boilerplate for something that used to be easy without namespaces, or write namespace traversal into your PHP (which isn't THAT hard, but is very ugly).
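
The boilerplate in question is having to qualify the dynamic name yourself, since class-name strings ignore "use" imports - something like this (illustrative):

  // dynamic class names are always resolved as fully qualified,
  // so prepend the namespace by hand
  $class = __NAMESPACE__ . '\\' . $some_array[$var];
  return new $class();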

Parameter skipping looks intuitive and useful. Very important as we incorporate more functional paradigms into the code.

I don't believe scalar type hinting will make it in. There's just too much discussion around it. IMHO, it should be super strict. "1" is not an integer.

Getters and setters: meh. I guess it's good. Better than what we have now.

I don't believe PHP will do generators or list comprehensions right, so I'm not holding my breath on those.

8
Nervetattoo 12 hours ago 1 reply      
Consistency in dereferencing is very important.
empty() should at some point be deprecated, but fixing this erratic behavior for now is good imo.

The default value for method arguments is a shockingly bad idea for a new feature; it only encourages the old, much-bashed code quality that has been PHP's greatest legacy problem. If anything along these lines, I'd like to see named arguments somehow.

I'm not so sure about property getters/setters, as I find the syntax a bit awkward, all while magic methods let you create getter/setter-based APIs.

9
masklinn 11 hours ago 1 reply      
I think a standard password hash API is a good thing (there was a proposal for such a thing for Python recently on -ideas, though it was essentially rejected in favor of recommending/using passlib), but I'm wary of having default cost factors, that seems unwise: what are the defaults, how are they decided to be ok, and more importantly in what context are they updated over time (and based on what data)?
10
snorkel 9 hours ago 2 replies      
Conspicuously absent from PHP:

   array_copy()

... in order to copy-by-value on arrays containing references. Currently to do that you have to do a very fragile hack like this:

   $copy = array_flip(array_flip($original));

... which is vulnerable to key-value collisions ... or you have to write your own homebrew array copy-by-value function.
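
A homebrew version might look something like this (a quick sketch, untested; note that it only shallow-clones any objects it encounters):

   function array_copy_by_value(array $arr) {
       $copy = array();
       foreach ($arr as $key => $value) {
           if (is_array($value)) {
               $copy[$key] = array_copy_by_value($value); // recurse into nested arrays
           } elseif (is_object($value)) {
               $copy[$key] = clone $value;                // shallow-clone objects
           } else {
               $copy[$key] = $value;                      // plain assignment drops the reference
           }
       }
       return $copy;
   }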

11
musashibaka 11 hours ago 3 replies      
17 years old and a PHP Internals tinkerer...

Awesome work Nikita, keep it up!

12
giulianob 11 hours ago 0 replies      
The "parameter skipping" functionality is really useful but IMO they should take the same route that C# did with named parameters.

    function create_query($where, $order_by, $join_type = '', $execute = false, $report_errors = true) { ... }

    create_query("deleted=0", "name", report_errors: true);

You can even specify all parameters by name, in a different order:

    create_query(order_by: "name", report_errors: true, where: "deleted=0");

13
mikeash 10 hours ago 2 replies      
Regarding the "constant dereferencing" proposal, how do you even make a language parser that can't apply array operations to literals in the first place? I'm trying very hard not to bash on PHP here, and this is a completely honest question. I can't figure out how you'd even go about breaking that if you wanted to. Anyone know the background there?
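
For context, the proposal is about making expressions like these valid, which the current parser rejects outright (if I'm reading the RFC correctly):

    echo [1, 2, 3][0];  // array literal dereferencing
    echo "foobar"[3];   // string literal dereferencing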
14
Sander_Marechal 7 hours ago 1 reply      
The proposals look great! But I'm missing one thing: fewer fatal errors, or a better way to catch them. Especially for the case of calling a method on a null value. E.g.

    $foo->getBar()->getBaz()

If getBar() returns NULL for some reason, you get a fatal error which you cannot catch and handle without resorting to ugly hacks with a shutdown function.

My personal preference would be to have that statement return NULL and raise a warning or notice, akin to using uninitialized variables.
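
In the meantime, the only clean-ish workaround I know of is to break the chain and guard by hand, something like:

    $bar = $foo->getBar();
    $baz = ($bar !== null) ? $bar->getBaz() : null;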

15
mmuro 11 hours ago 0 replies      
Yet another array function. But, array_column does seem super useful.
16
psaintla 7 hours ago 0 replies      
If there is scalar type hinting, I wonder if that will lead to real method overloading by adding methods with different signatures, or if devs will have to stick with the __call magic method, which I despise.
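
For readers who haven't seen the hack, __call fakes overloading by dispatching on argument types at runtime - roughly like this (an illustrative sketch; the find* method names are made up):

    class Repo {
        public function __call($name, $args) {
            // __call fires because no real find() method is defined
            if ($name === 'find' && is_int($args[0])) {
                return $this->findById($args[0]);
            }
            if ($name === 'find' && is_string($args[0])) {
                return $this->findByName($args[0]);
            }
            throw new BadMethodCallException($name);
        }

        private function findById($id)     { /* ... */ }
        private function findByName($name) { /* ... */ }
    }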
17
zapt02 11 hours ago 0 replies      
The getters and setters change is looking great! Overall lots of good ideas!
18
velodrome 6 hours ago 1 reply      
Is a PHP JIT at least on the roadmap?
19
ShaneCurran 10 hours ago 0 replies      
Laravel already does a lot of this stuff ^^^
20
josteink 10 hours ago 0 replies      
If you are fine running a decade+ old OS, chances are you don't have a dire need to run a bleeding edge version of PHP locally.
21
ardillamorris 11 hours ago 7 replies      
No matter how much PHP is improved upon, until there is a serious contender framework for it like Rails or Django, PHP will continue its route to extinction. Dinosaurs were once big and powerful. They dominated the landscape. They don't exist anymore.
25
Some things I've learnt about programming jgc.org
211 points by jgrahamc  16 hours ago   113 comments top 20
1
AngryParsley 14 hours ago  replies      
Programming may be a craft, but researchers have published tons of studies about this craft. Many of these studies contradict anecdotal evidence. For example, copying code isn't as bad as you might think: http://www.neverworkintheory.org/?p=102

Another example is TDD. People espouse the benefits, then some study comes along (http://www.neverworkintheory.org/?p=139) saying the benefits are largely illusory and that code reviews are more effective.

Instead of listening to the experts at programming, listen to the experts on programming. Read some studies about the effectiveness of various tools and methods. Try new things. Programming is a craft, and like many crafts it contains significant amounts of dogma passed from teacher to apprentice.

2
defdac 15 hours ago 10 replies      
"You have to ask yourself: "Do I understand why my program is doing X?""
"I almost never use a debugger."
"And if your program is so complex that you need a debugger [simplify]"

To me, being afraid of a debugger is like being afraid of actually knowing exactly what is going on - being lazy and just reading logs and guessing what might have gone wrong, instead of letting the debugger scream in your face all the idiotic mistakes you have made.

I would argue that using the debugger is being lazy in an intelligent way, instead of spending hours reading endless logs trying to puzzle together logic the debugger can show you directly.

3
bambax 14 hours ago 0 replies      
A lot of those things apply to many human activities.

I don't build bridges but I would be very surprised if an architect described his work as "pure science and no craft at all" (how would it be possible, then, to build beautiful / ugly bridges?)

I do a little woodworking and have many tools; friends sometimes look at my shop and ask if I really need all that -- yes, I do. In the course of a project you get to use many different tools. You can get by missing one, but it takes far longer to work without the exact tool. (Same thing with photography.)

I'm learning to fly, and the most important word regarding human factors is "honesty". The way to fly is not to avoid mistakes, it's to detect them and minimize the consequences; if you feel you can do no wrong you'll eventually kill yourself.

4
vbtemp 12 hours ago 1 reply      
> 0. Programming is a craft not science or engineering

Unless, of course, you are software engineer-ing.

Flight guidance-and-control systems, among many other things, are precisely engineered software systems. In a world of web apps and mobile apps, people tend to forget this kind of software exists.

Sure, working on your web app, writing some jQuery widgets, or coding up some Python scripts is a craft.

5
StavrosK 15 hours ago 8 replies      
I never noticed until now, but I never use a debugger either. An employer urged me to start using one while I was working on their code, but it's just not natural for me. If I need to debug a crash, I just print the 1-2 variables of the state involved, see what's wrong, and fix it.

Is there anyone who uses a debugger for more than inspecting state?

EDIT: I guess lower level languages and more involved applications use debuggers much more extensively.

6
asto 15 hours ago 4 replies      
A large part of the post can be rewritten as "don't be lazy".

1. Don't be lazy and just do something that works without taking the time to learn why it works.

2. Don't be lazy and just stop when you have something that works. Go through the code again and see if you can make it better.

4. If you find yourself writing the same thing twice, don't be lazy and carry on, put the code in a single place and call it from where you need it.

Or at least that's how I see it. I do all of the things I shouldn't do, largely because doing things the wrong way is so much easier!

Edit: Rather than rewritten, I meant "falls under the general category of". The article was great!

7
zethraeus 11 hours ago 0 replies      
It's nice to see number 5.

>Be promiscuous with languages

>I hate language wars. ... you're arguing about the wrong thing.

It's easy to take this for granted, but it's a concept that is very important to stress to new coders. If you spend too much time focusing on one language, you run the risk of the form becoming the logic. This is a dangerous place, where your work can be better analogized to muscle memory than to logical thought.

At least in a college environment, I think this lack of plasticity causes discomfort with different representations of similar logic - and so flame wars abound.

8
wissler 14 hours ago 1 reply      
Taking personal pride in not using a debugger is a bad idea. Sometimes it's the right tool for the job, and if your picking it up makes you feel dirty, you're only handicapping yourself.
9
Tashtego 12 hours ago 0 replies      
"I'm not young enough to know everything"

Having recently started mentoring/managing the first really junior engineer on our team (self-taught, <1 year programming experience), boy does this ring true. Luckily I'm of the temperament to find the "advanced beginner" stage of learning more funny than annoying.

I think it's possible to understand as little about your code when using loggers as when using debuggers, so I have a hard time agreeing with him there. I think his general point about having tools and knowing when to use them applies just as much to that as it does to language, so he contradicts himself.

10
einhverfr 14 hours ago 0 replies      
These are all golden lessons that people who think about writing code generally learn.

One thing I would add though is that there are many times when there is time pressure and a kludge works. The right thing to do here is to document that it is a kludge so that if/when it bites you later you have a comment that attracts your attention to it.

"I don't understand why this fixes the problem of X but this seems to work" is a perfectly good comment. It's great to admit in your comments what you don't know. (That's why questions relating to commenting are great interview questions IMO.)

Finally, I think it's important in the process of simplification to periodically revisit and refactor old code to ensure it is consistent with the rest of the project. This should be an ongoing gradual task.

Anyway, great article.

11
10098 4 hours ago 0 replies      
> I almost never use a debugger. I make sure my programs produce log output and I make sure to know what my programs do.

I used to do precisely that. Sprinkle code with log messages, recompile and run. When I finally learned how to use gdb, my debugging productivity increased tenfold.

I mean, just the ability to stop your program at any given point gives you an enormous advantage. Not only can you examine the local state of your program, you can also see how the state of systems outside your program (e.g. a database) changes, all without polluting the code with tons of useless debug messages.

Often when I had new ideas during bug hunts, to test my hypothesis without a debugger I had to go back and add new logs, then recompile, then run (and make sure it reached the same state as before!) - lots of wasted time. With a decent debugger it's as easy as typing an expression.

And I don't think debuggers lead to lazy thinking. The process of finding the problem is the same whatever method you use - you analyze the code, have an idea about what could be wrong, change one thing, then see what happens. Debuggers just make it easier.

12
bromagosa 13 hours ago 0 replies      
The debugger part doesn't look generic enough to me. As a Smalltalk programmer, I can only say usage of debuggers depends _a lot_ on which language you code in.

In Smalltalk, you practically live inside the debugger. Also, if you are an ASM programmer, the debugger is indispensable.

13
raheemm 15 hours ago 2 replies      
#7. Learn the layers. Is this even possible anymore? Seems like with APIs and frameworks, there are layers upon layers just within code. Then you have the OS, the hardware, the network layer (whoa! 7 layers right there!)...
14
mustafa0x 15 hours ago 2 replies      
> Programming is much closer to a craft than a science or engineering discipline. It's a combination of skill and experience expressed through tools

You seem to be implying that the latter statement doesn't apply to the disciplines of science and engineering. "Skill and experience expressed through tools" is highly important in both watchmaking and bridge building. I would advise anyone who says otherwise to reconsider.

I understand your point, but why create a hugely false dichotomy between a craft discipline and the science and engineering disciplines?

---

I strongly concur with points 2 and 6.

15
HarrietJones 15 hours ago 1 reply      
Good article, though I don't agree with it all.

It is harder to grow software than it is to initially build it. Preconceptions bite you on the ass, data structures don't allow for new features, side effects multiply.

You don't need to learn the layers. In fact, if you're learning all the layers, you're probably an ineffective coder. This is not to say that you shouldn't investigate the layers or have a poke around them. But software's about reuse, and reuse is about reusing other people's work via known interfaces without worrying overmuch about what goes on under the hood.

I'm actually more of a debugger than a profiler, and as much as I'd like to believe that my way is as valid as his, I suspect that he's probably right on this and I'm probably wrong.

16
MindTwister 15 hours ago 0 replies      
9. You count from 0
17
plg 2 hours ago 1 reply      
Is "learnt" a real word?
18
alainbryden 11 hours ago 0 replies      
I like that you started numbering at 0 - very apropos.
19
cryptide 10 hours ago 2 replies      
>>a printf that's inserted that causes a program to stop crashing.

Huh?

20
pjmlp 15 hours ago 0 replies      
Quite interesting
26
How Amazon's ambitious new push for same-day delivery will destroy local retail slate.com
465 points by rmason  1 day ago   299 comments top 52
1
wheels 23 hours ago  replies      
Amazon has been doing this in parts of Germany since 2009. The results haven't been nearly as dramatic as the article predicts. As others have indicated, the limiting factor isn't so much speed of delivery as the inclination to physically inspect items before buying them. That said, Zappos, now a part of Amazon, has succeeded in one of the markets that tends most towards such inspection. Even so, there are still a lot of places to buy shoes.
2
CitizenKane 19 hours ago 2 replies      
This is already happening in China. Because of the nature of shipping companies here it's not unusual to buy a product (typically off of http://taobao.com) and get it later that day or the next day. Many of these businesses are based in Shanghai and if you live there delivery happens nearly instantly. One online shop, http://cheers-in.com/ delivers cold beer in Shanghai in 1 - 2 hours from an order. Stuff like this is absolutely fantastic and it would be amazing to see it come to the US.
3
fsckin 22 hours ago 4 replies      
Amazon failed to collect/remit ~$270M in sales tax in Texas and has owed millions for years. So what did the state of Texas do about it?

They struck a deal to erase the $270M owed, as long as Amazon starts collecting sales tax on 7/1, creates 2,000 jobs, and invests $200M in Texas. They were likely already planning more infrastructure in Texas, but threatened to reverse course and pull out of the state entirely.

The way I see it, Amazon bluffed the state of Texas to the tune of 270 million dollars and all I might get out of it is next day delivery?

I'll take what I can get, I guess.

4
Caerus 1 day ago 6 replies      
> Amazon has long enjoyed an unbeatable price advantage over its physical rivals. When I buy a $1,000 laptop from Wal-Mart, the company is required to collect local sales tax from me, so I pay almost $1,100 at checkout.

This is such an overblown argument. Sure, Amazon is ~7% (where I live) cheaper than traditional stores due to sales tax. But that $1000 Wal-Mart laptop has a $900 sticker price on Amazon, and most non-electronic items are 20-40% cheaper than in stores.

If laws change and I have to start paying sales tax on Amazon, it won't change a thing about my buying habits.

Edit: They also have an inventory many times larger than any brick-and-mortar store. Whenever I go shopping, I have to settle for the least crappy option Wal-Mart decides to stock. On Amazon, I get exactly the one I want.

5
russell 32 minutes ago 0 replies      
I actually have a problem with Amazon's shipping. I live in a small town. Often FedEx shipments are dropped off at the USPS, so a two-day delivery turns into three or more days. The worst was 5 days. UPS, OTOH, often delivers the next day.
6
PaulHoule 23 hours ago 3 replies      
A general policy that "online stores pay sales tax" also benefits Amazon vs. small internet retailers.

A while ago I sold a few bumper stickers online and ended up using CafePress. I could have made a better profit by printing the stickers in bulk and mailing them to people, but I'd have had to spend $100+ on paperwork just for the privilege of paying sales tax, in case I sold any to New Yorkers.

If small internet businesses had to pay taxes to the 40+ states that have sales tax, plus to all the other jurisdictions (cities, counties, who knows what) in the U.S., it would be almost impossible to sell stuff and comply with the law.

For AMZN the overhead is nothing.

7
api 1 day ago 4 replies      
More technology driven hyper-deflation on the way.

The future: high inflation in food, energy, fuel, and consumables, hyper-deflation in everything else except to the extent that it depends on or consumes the former.

8
mthoms 22 hours ago 1 reply      
This is the Trojan horse. Once Amazon has a large presence in each major centre, the next logical step for them is to use some of their massive space as a showroom (think Ikea, but on a much larger scale).

[Edit: To clarify, the showroom and warehouse would be in the same complex but separated. Shoppers + heavy merchandise + fast moving robots is a recipe for disaster.]

Then they can satisfy both the "I want to see it before I buy it" crowd and the "I know what I want - just give me the best price" crowd.

Mark my words. Amazon has the Costcos, Walmarts, and Best Buys of the world squarely in its crosshairs.

*Of course, there will always be specialty categories that are too niche to fit in this model, so many specialized retailers will still exist.

9
tomfakes 1 day ago 3 replies      
I bought 2 things on Amazon this week, both with free 2 day shipping with Prime

Item 1 spent 13 hours on a UPS truck driving around my city and was delivered at about 7pm, two days after ordering.

Item 2 was delivered by Amazon Fresh at 9am, 14 hours after purchase.

For the same price of shipping, which service would you rather have?

EDIT To Add: The delivery guy for the 14 hour item works for Amazon - the whole experience was produced by Amazon without needing a third party. UPS is another company that will be in trouble if Amazon can make this scale.

10
stretchwithme 7 hours ago 1 reply      
Because the last mile is always so expensive and subject to theft, I think we'll eventually have a centralized local pickup location for all sorts of small deliveries. We'll just stop there on the way home if something has arrived.

That will even include your postal mail, if the postal service ever wakes up. There's no need to physically deliver mail every day if people can see what mail they've received remotely and can pick it up or request delivery if they can't leave home.

11
andyl 1 day ago 0 replies      
Retailers pushed to impose the sales tax on Amazon. Now it looks like they are going to get what they asked for.
12
zackmorris 55 minutes ago 0 replies      
I thought of this November 3, 2010 :-)

http://beginwithyou.org/2010/11/03/the-next-ebay/

13
JoeCortopassi 1 day ago 5 replies      
Isn't this just one step away from Amazon being another brick-and-mortar? If this is the case, is not having an actual store that is accessible by customers (and, coincidentally, the necessary staff) that much of an operational advantage?

Or is this just a case of the more efficient company (Amazon) beating out less efficient companies (Best Buy, Barnes & Noble, etc.)?

14
dr_ 23 hours ago 1 reply      
"I have no idea how Amazon made any money on my order (the whole bill was less than $30) but several people on Twitter told me that they've experienced similarly delightful service."

Therein lies the problem. At some point there will have to be revenues to justify the company's valuation. I suppose their goal is to first obliterate all competition entirely and then have everyone purchase from Amazon. I'm doubtful this will work. Of late, there has been a trend towards experience stores, with manufacturers creating their own stores instead of distributing to retailers. Many luxury brands do this, and even some non-luxe ones, such as Samsonite, have been getting into the game. There's some value added here, and it's something Amazon won't be able to directly compete with.

15
ddt 1 day ago 5 replies      
It'll destroy larger B&M stores but I doubt it'll fully outdo local specialty shops. What you'll see is a stratification between hyper specialty B&M that sell luxury items only a tiny subset of people want, but are willing to pay out the nose for, and places like Amazon filling the role of Target and Walmart, being the catch-all for everything else that most people need on a week-to-week or month-to-month basis. While you might be able to buy a certain brand of organic mustache wax on Amazon, I don't see a day coming where they'll be able to do that same-day.
16
sageikosa 4 hours ago 0 replies      
At long last, the dreams of the Coyote ordering from Acme and getting immediate delivery of his latest gadget to catch the Roadrunner come closer to fruition.
17
bennesvig 1 day ago 5 replies      
The only advantage local retail stores have is immediacy. You can buy it as fast as you can drive there (and park/find it in the store/wait in line). Amazon has almost every other advantage. I've come to find shopping at local retail stores more and more frustrating: shopping without reviews or videos, and having to flag down employees to help you locate items. It's really hard to beat online shopping with one-day or two-day shipping. Not impossible, but challenging. Target, Best Buy, and other generic mass retailers would be hurt the most.
18
jonhendry 7 hours ago 0 replies      
Local retail is their own worst enemy, at least those that have websites.

If I go to the website of a brick and mortar store, chances are I want to know what they carry in-store, because I need something specific and am planning to visit a store.

Ideally I'd be able to see what they have in stock at the local stores, so I can avoid a needless trip.

Instead, many retailers have larded up their online stores with online-only products. Some don't even give you an easy way to exclude those and see only in-store items.

This is stupid, and just makes me more likely to not bother visiting the store at all, opting to just order from Amazon.

19
ktr 1 day ago 1 reply      
This is crazy - I ordered 3 things from Amazon today through Prime and started wondering if/when a day would come when you'd have same day delivery from Amazon and what it would look like. I figured it would happen someday, but thought the complexities would be too much to handle for a while. Looks like they're way ahead of me. This is why I love Amazon.
20
ori_b 23 hours ago 0 replies      
When Amazon lets me physically try out the feel of things while browsing them, see how well built a tool is, or how a piece of clothing fits, it might have a chance of killing local retail.

Until then, I like to see which pants fit best, which knife is most comfortable in my hand, which tablet is most responsive. Yes, if I already know what I want, I'll go to Amazon. But if I'm not entirely sure and I want to compare things, I'll go to a local store.

21
k-mcgrady 11 hours ago 0 replies      
I don't think it will destroy local retail. All the businesses within a 10-minute walk of your home will be fine; that's still more convenient than same-day shipping unless you're very busy. It's the businesses slightly outside the core of the town that will suffer. If it takes you 30-60 minutes to drive to a store, you might order same-day from Amazon.
22
Shivetya 8 hours ago 0 replies      
I remember my grandmother telling us how first the bus and then the family car let her shop where she wanted to. Stores adapted. Face it, lack of transportation will allow bad service to exist simply because the customers are trapped.

When you're the only game in town, well.

So I don't buy the dire predictions.

23
peppertree 1 day ago 1 reply      
Amazon is playing chess while the brick and mortar stores are playing checkers.
24
learc83 23 hours ago 3 replies      
The other day I ordered something from Amazon, and it showed up in a small cargo van, delivered by a guy without a uniform, and in a package with no UPS or FEDEX label (or any other delivery label I could discern apart from the amazon labels).

Was this some kind of test of Amazon-run delivery from a local warehouse?

25
damian2000 20 hours ago 1 reply      
Here's the original Financial Times article that this one was mostly taken from (nb: this link avoids their paywall via google's url redirection):

http://www.google.com/url?q=http://www.ft.com/intl/cms/s/0/9...

26
jusben1369 23 hours ago 1 reply      
Remember when home video was going to obliterate movie cinemas? Who would go out when they could watch a movie on their couch?! For a while, things did dip. Then people realized there was a social element to going to the movies that made it a compelling experience. Add to that IMAX and 3D, etc. Heck, who would bother going to an Apple store when you can buy everything online!

Amazon is to shopping what McDonald's is to food. We all know what happens when you have McDonald's every day.

27
mseebach 22 hours ago 0 replies      
> Physical retailers have long argued that once Amazon plays fairly on taxes, the company wouldn't look like such a great deal to most consumers.

I love the audacity of "If you can't beat them, find some other way to beat them."

28
zeroonetwothree 18 hours ago 0 replies      
Workplaces should be an easy place to get same day delivery. At a large office you probably have 100+ Amazon orders each day that conveniently get delivered by the company's staff. Amazon could have their own truck just drive straight to the office building without getting UPS involved at all.
29
ams6110 10 hours ago 0 replies      
I don't see how they can possibly achieve the same economies of scale with a lot of small local distribution centers vs. a few huge ones. Plus coordinating deliveries in all those local markets vs. just having UPS or FedEx handle that part. To a point this may be successful, but it sounds self-limiting to me.
30
baak 5 hours ago 0 replies      
To be honest, I feel this has a good outcome for environmentalism as well. Same day delivery putting physical retailers out of business means less overall driving.
31
hef19898 18 hours ago 0 replies      
The biggest challenge I see coming for Amazon, besides competition and markets and all that, is their ability to handle their whole supply chain. Amazon has already proved that in terms of warehousing and distribution they are really good. But the one thing that comes with an ever-increasing number of warehouses, especially if you want same-day delivery, is an ever-increasing inventory level, in both number of items and value.

So from the outside, one cornerstone of a strategy like this would be inventory management at at least the regional, if not the warehouse, level. It's doable, no question, but difficult. An advantage Amazon has here is a huge history of point-of-sale data for most of the important regions they operate in, no matter whether there actually is an Amazon warehouse or not. And if they don't have (enough) point-of-sale data, well, then it's too early to offer same-day service and all that.

All Amazon has to do now is keep operations up to the task... :-)

32
hrktb 11 hours ago 0 replies      
I'd have died to have that in the US or Japan, for example - anywhere delivery of basic goods actually works and isn't painful. I feel the main barrier to this is getting the local delivery company to deliver things on time and in a friendly fashion.

That's not something you can expect in every country Amazon operates in (France's Chronopost delivery is really hostile, for example, and there should be a ton of others). I'd bear with it for things I expect in weeks anyway; it would be horrible for goods that are supposed to be there today.

33
adventureful 23 hours ago 0 replies      
I thought Walmart already destroyed traditional local retail?

This one goes in the scaremonger bag.

34
grandalf 23 hours ago 0 replies      
I already don't shop locally, except on a whim or for items I'm not price-sensitive about at all. Well, groceries are an exception, but I like to pick my own produce.
35
jsavimbi 23 hours ago 0 replies      
I don't shop local any more unless it's for a premium item and the customer service is beyond average. Not when Amazon can deliver consumer items the next day.
36
ilaksh 19 hours ago 0 replies      
OK so pretty soon Amazon will have same day delivery. That's going to be great!

Then, can we have evacuated tubes that connect our homes to all of the retail outlets, or homes to homes maybe also, and then they could have like 5 minute delivery, or however long it takes the picking robot to get it and then for it to travel at hypersonic speed through the tube?

Maybe we should all live within a few hundred yards of the warehouses. Then upgrade the picking robots and streamline the warehouses and tubes so they can deliver items in less than a minute.

Then we can all have robots that pull the items out of the tube, open the packaging and hand it to us on our couches/beds.

I would totally buy a giant plastic jug of cheeseballs right now if it could arrive in less than a minute and be delivered to my hands.

37
S201 1 day ago 2 replies      
Looks like Amazon is finally bringing Webvan to fruition. Only took 11 years.
38
parka 9 hours ago 0 replies      
I hope this applies to international shipping as well.

I buy stuff regularly from Amazon, and their prices are even cheaper than what my country's local stores offer, with shipping included. E.g., a book from Amazon costs ~30% less than in the shops here in Singapore.

39
webjunkie 17 hours ago 0 replies      
In Germany, when you order before noon, you will most likely get your stuff by the next morning without any special shipping at all. That's how efficient the German postal service is :)
40
arjn 18 hours ago 1 reply      
My opinion is that this may not be the end of local retail. I would still prefer to shop at my local grocers and farmers market on the weekend. Sometimes shopping is more than just shopping; it's also an outing with the family.
41
tmuir 1 day ago 0 replies      
In the Chicago area, you can order from McMaster-Carr (mechanical hardware) and get same-day delivery. Any company that builds anything mechanical is probably ordering something from McMaster-Carr. This isn't an impossible problem.
42
stephen272 18 hours ago 0 replies      
First the internet revolutionized music and eliminated record stores.
Then it revolutionized video and took down video stores.
It didn't stop there, and moved on to bookstores and wiped them out.
And now it's taking out brick-and-mortar stores in general.
Honestly, I LOVE IT! The internet is so great. Anything that gets stuff in my hands faster or easier is great in my eyes.
43
Scene_Cast2 1 day ago 1 reply      
There are a few fundamental, hard problems: energy and transportation. Amazon is trying to solve a subset of the latter here. However, I like to see/feel some things in person before buying. I often go through a lot of items before finding "the one". Examples: food, pens, shoes. It would revolutionize the industry if Amazon could solve that. Until then, it's just a really large online store.
Also: the USA isn't the world. Non-US Amazon is heavily sub-par compared to other online stores: worse prices, smaller selection. So there's that, too.
44
snambi 7 hours ago 0 replies      
Amazon is trying to become a specialized shipping service?
45
dave5104 22 hours ago 1 reply      
I'm surprised Amazon hasn't just started their own logistics company. With the spread of their warehouses now, is the next logical step becoming their own USPS/UPS/FedEx? I can only imagine what would happen once Amazon takes over the part of the process that seems to always have the most problems.
46
wilki 17 hours ago 1 reply      
I do wonder how much online retailers have saved by not dealing with shoplifting and the overhead involved with a fleet of loss-prevention employees.
47
emperorcezar 21 hours ago 1 reply      
A business is required to pay sales tax, not to collect it. Like some other businesses, Amazon can just include the tax in the price.

For instance, bars will include the taxes in the drink price, because if they showed you how much tax is in a glass of beer you'd be very surprised; it's something like 40%.

It's a mental game, though. Do you get people "through the door" with an item at $19.99 and hope they don't close the tab when they see the $1.00 tax, or do you just advertise it at $20.99 and hope that people are attracted to you because of name, etc., even though joesonlineshop.com has it for $19.99?
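
The arithmetic behind the two options, spelled out; the 5% rate here is made up for the example:

    const listPrice = 19.99;
    const taxRate = 0.05; // hypothetical rate yielding ~$1.00 of tax

    // Option A: advertise pre-tax and add the tax at checkout.
    const checkoutTotal = listPrice * (1 + taxRate);
    console.log(checkoutTotal.toFixed(2)); // "20.99"

    // Option B: advertise the tax-inclusive price up front.
    const advertised = 20.99; // same total, just shown from the start
    // Revenue is identical either way; only the framing differs.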

48
mariuolo 15 hours ago 0 replies      
I can think of several local stores I would like to see erased from existence.
49
kfury 23 hours ago 0 replies      
TL;DR: "Amazon's going to wreck retail now that they're giving up their unfair advantage and relying solely on better service."

Boo hoo.

50
sanjiallblue 20 hours ago 0 replies      
Once 3D printing tech gets off the ground, same-day delivery could end up becoming common or even the standard.
51
ck2 23 hours ago 2 replies      
What this is going to do is punish people in non-metro areas.

The way that happens is the same way supermarkets work. In very competitive areas, cut-rate prices, coupons, and promos like coupon doubling are offered very liberally. To make up for that profit loss, they pump up the prices and cut out promotions in areas where there is little to no competition.

In areas with a Walmart and a Target, and now Amazon local delivery, prices are going to be crazy good.

Everyone else will suffer as they subsidize the profit loss.

52
craze3 1 day ago 6 replies      
(Disclosure: Slate participates in Amazon Associates, an "affiliate" advertising plan that rewards websites for sending customers to the online store. This means that if you click on an Amazon link from Slate, including a link in this story, and you end up buying something, Amazon will send Slate a percentage of your final purchase price.)

Do they really think that they'll get more affiliate conversions by being the honest guys? Seriously, what is the point of this?

27
Show HN: DeadMouse – Surf the web with just your keyboard. github.com
6 points by chetan51  1 hour ago   6 comments top 5
1
positr0n 11 minutes ago 0 replies      
I use vimium [1], which has this feature (search for links to open them) and much more. It doesn't have the cool effect though :)

[1] http://vimium.github.com/

2
YellinBen 7 minutes ago 0 replies      
The wiggling is cute at first, but I think a more straightforward method of highlighting would make it easier to find the active link.
3
bluespice 15 minutes ago 0 replies      
That's pretty clever; I think this should be implemented natively in browsers. It's nice that it's bound only to the visible part of the page too.

But let's face it: if web devs were conscious about keyboard usability, we'd at least see the tabindex HTML attribute used sometimes, and it's been around for a while.
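
For the curious, a rough sketch of how this style of type-to-follow navigation can be implemented, including the restriction to the visible part of the page mentioned above. This is a guess at the general approach, not DeadMouse's actual code:

    // Gather links whose visible text starts with what the user typed,
    // considering only links currently inside the viewport.
    function matchVisibleLinks(typed: string): HTMLAnchorElement[] {
      const links = Array.from(document.querySelectorAll('a'));
      return links.filter((a) => {
        const r = a.getBoundingClientRect();
        const onScreen =
          r.bottom > 0 && r.top < window.innerHeight &&
          r.right > 0 && r.left < window.innerWidth;
        const text = (a.textContent ?? '').trim().toLowerCase();
        return onScreen && text.startsWith(typed.toLowerCase());
      });
    }

    let buffer = '';
    document.addEventListener('keydown', (e) => {
      const t = e.target as HTMLElement | null;
      // Don't hijack typing inside form fields.
      if (t && (t.tagName === 'INPUT' || t.tagName === 'TEXTAREA')) return;
      if (e.key === 'Enter') {
        const [first] = matchVisibleLinks(buffer);
        if (first) first.click(); // follow the best match
        buffer = '';
      } else if (e.key === 'Escape') {
        buffer = '';
      } else if (e.key.length === 1) {
        buffer += e.key; // accumulate the typed prefix
      }
    });

A real extension would also need to highlight candidate matches (the wiggle discussed above) and handle backspace, but the core is just a text-prefix filter over on-screen anchors.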

4
richo 10 minutes ago 0 replies      
So it's like vimium but you have to type more?
5
lightyrs 48 minutes ago 1 reply      
Thanks!
28
Should Applied Research Funding Go To Startups or Academia? scienceexchange.com
25 points by bmahmood  5 hours ago   19 comments top 8
1
jayzee 4 hours ago 1 reply      
This is a great post on a topic that is quite close to our hearts, since we are a science startup as well. I think this situation is a symptom of the mentality, 'You can't be fired for buying IBM.' It is so much easier to give money to a research group at Harvard, etc., than to 2 guys starting out from an apartment. Especially in a bureaucratic setting where your job is to make decisions that do not look bad, and not to actually make good decisions!

Adam and I were once on a call with an academic consortium that had received ~$15M in funding from ARAA. The objective of this group was quite similar to what we are doing with our startup. This group reached out to us to see if there was a way we could work together. During the conference call it was Adam and me on our side and at least 10 researchers from institutions all over the US on the other side!

Those were the early days of our startup, and we thought we would be obliterated by them with their vast resources. It turns out that with 1/10th their funding we have done more than 10x what they did. And they are floundering while we are just warming up.

2
qq66 1 hour ago 0 replies      
The returns to academic funding are extremely high, and if anything, MORE government money should be funding academic research. The strong funding for academic research is one of the principal reasons for the US's level of scientific and technological success.

For-profit businesses already have well-established funding channels, namely angel investors and venture capitalists. Government doesn't do a good job of investing in startups (Solyndra?) and should stay out of it.

3
Irishsteve 4 hours ago 1 reply      
Applied research is nothing new; sure, isn't that what PARC was? Commercial labs still exist, in many different fields. (I know PARC was Xerox's, but now they take on proof-of-concept work from external clients.)

The real problem is how you identify particular groups or commercial labs to approach so that they can complete your work.

Research money is quite risky, because you are asking someone to achieve something that does not yet exist. You have a good probability of running out of money before something comes back to you.

4
PaulHoule 4 hours ago 1 reply      
arxiv.org is perhaps the only successful community to come out of academia, and it was successful precisely because it wasn't funded... all of the early work was done on stolen time, so there was no BS having to do with grants, etc.

Back when I was involved with arxiv.org, we had 1/8 the budget of some people next door who'd built a huge portal that had essentially no end users. Perhaps we could have done so much more if we'd had more money, but practice shows that academics will eat the money up and deliver very little for it.

5
tjic 5 hours ago 1 reply      
Un-ask the question.

Please first explain why there should be such a thing as "applied research funding".

6
thejteam 2 hours ago 1 reply      
In the US at least, there is always SBIR (Small Business Innovation Research) funding. Most of it is through the DOD, but other federal agencies including the NSF participate as well. The NSF grants are pretty open-ended: come up with an idea in their broad categories and submit a proposal. It helps greatly to have a PhD on your team as the initial Principal Investigator, but at least for the DOD awards it is certainly not strictly required.
7
eli_gottlieb 1 hour ago 0 replies      
Oh, someone else wants to hate on academia?
8
vshade 3 hours ago 1 reply      
The important thing is that publicly funded research should have publicly available results. I fail to see how to keep the results publicly available while remaining a competitive startup.
29
Smooth Voxel Terrain 0fps.wordpress.com
80 points by mariuz  11 hours ago   4 comments top 4
1
CountHackulus 6 hours ago 0 replies      
If you're still interested in realtime voxel landscapes, I'd suggest checking out this series of articles from the (sadly now defunct) flipcode:
http://flipcode.com/archives/Realtime_Voxel_Landscape_Engine...
2
fredley 8 hours ago 0 replies      
This looks like the basis for an extremely cool Minecraft mod.
3
sfvermeulen 35 minutes ago 0 replies      
That was really interesting. I had no idea there were better approaches than marching cubes.
4
illicium 4 hours ago 0 replies      
Great application of WebGL.
30
Android Forums hacked: 1 million user credentials stolen zdnet.com
28 points by Empro  5 hours ago   19 comments top 6
1
vibrunazo 5 hours ago 1 reply      
Worth noting this has nothing to do with any official Android website, nor is it in any way related to Google. It's just a community website made by Android fans for Android fans.
2
mcyger 5 hours ago 2 replies      
What's to be learned from this?

Server settings needed hardening?

Software needed updating and was vulnerable?

vBulletin is a well-used and well-documented piece of code. I'd love to know what the security experts here think.

3
cluda01 5 hours ago 1 reply      
What's the significance of this? Upon cursory glance it seems like a community site for Android developers. Am I missing something?
4
joekrill 4 hours ago 0 replies      
In all fairness, they appear to have handled this incredibly well and have been very informative. Which is much more than can be said for the breaches in most other cases. And at least the passwords were hashed (although how well, they don't really say -- I guess that would be part of the vBulletin package?)
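
For context: vBulletin of that era is commonly reported to store passwords as md5(md5(password) + salt). Below is a sketch of that scheme next to a deliberately slow alternative, using Node's built-in crypto module; the vBulletin construction shown is as reported, not verified against its source:

    import { createHash, randomBytes, scryptSync } from 'crypto';

    // Fast, salted MD5 in the style attributed to vBulletin. The salt
    // blocks precomputed rainbow tables, but MD5 is so cheap that a
    // leaked dump can still be brute-forced at billions of guesses/sec.
    function vbStyleHash(password: string, salt: string): string {
      const inner = createHash('md5').update(password).digest('hex');
      return createHash('md5').update(inner + salt).digest('hex');
    }

    // A memory-hard KDF makes each guess expensive by design.
    function slowHash(password: string): string {
      const salt = randomBytes(16);
      const key = scryptSync(password, salt, 32);
      return salt.toString('hex') + ':' + key.toString('hex');
    }

    console.log(vbStyleHash('hunter2', 'aB3')); // instant to compute
    console.log(slowHash('hunter2'));           // slow on purpose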
5
Xavura 4 hours ago 1 reply      
This has been happening a lot lately; wasn't there something with Yahoo! just a few days ago? And I recall one or two others not long ago.
6
da_n 5 hours ago 0 replies      
But isn't Android open?

(sorry, that was terrible).
