hacker news with inline top comments    .. more ..    23 Dec 2012 News
Dangling by a Trivial Feature dadgum.com
18 points by nickmain  1 hour ago   4 comments top
guard-of-terra 31 minutes ago 2 replies      
That's why you need settings in your tools.

The tool should work efficiently without you ever looking at the settings, but if you need it you should be able to change anything you like.

Not sure it applies to consumer interfaces. Ideally it shouldn't. But for tools that you use to do day-to-day work, absolutely.

Non-Technical Questions To Ask When Hiring A Development Firm mojotech.com
24 points by kishfy  2 hours ago   7 comments top 4
mansoor-s 9 minutes ago 0 replies      
"Can I adjust my feature set and specifications as we go?"

Client: "Now, this TODO list app is great, but we want something more like Facebook and Linkedin combined"

This is exactly how you get feature creep, and it's a really bad idea for fixed-bid projects. But of course it depends on how big a pivot it is and how big the contract is.

"Can I meet and work directly with the designers and developers on my project?"

Sure, you can meet and get advice from them, but you may not work directly with them. There are some pretty crazy clients out there, and it is the management's job to keep them from driving developers insane. That is why management gets the "big bucks". They have to deal with the crazy.

ianpri 1 minute ago 0 replies      
Some of this is awfully bad advice - why is it useful to work directly with multiple designers/developers rather than a single point of contact/project manager? Isn't this pushing the project management onto the client's side?

As a developer do I really want the client being able to dominate all my time when I have multiple projects on the go at the same time?

nkuttler 5 minutes ago 0 replies      
I think the points raised in the five questions would mostly come up in underfunded projects. Sure, you can have everything you're willing to pay for. Especially things like

> Can I adjust my feature set and specifications as we go?

bybjorn 29 minutes ago 1 reply      
"Work for hire"? .. no thanks
Where the logic hides in rails apps github.com
10 points by dogas  1 hour ago   4 comments top 2
0x0 27 minutes ago 1 reply      
These kinds of callbacks, which remind me of the "aspect oriented programming" that was hyped for a short while a few years ago, look super dangerous.

I could easily imagine someone unaware of this hook running a test to create a bunch of user entries, sending emails all over the place without even realizing.

It's like someone read http://en.wikipedia.org/wiki/COMEFROM and took it seriously.
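The failure mode described above is easy to reproduce in a few lines. Here is a toy Python sketch (all class and function names are hypothetical, standing in for ActiveRecord-style `after_create` hooks): the mailer call is invisible at every call site that creates a record, so a test fixture "sends" mail without meaning to.

```python
# Toy sketch of a lifecycle-hook system (names invented).
class Model:
    _hooks = []

    @classmethod
    def after_create(cls, fn):
        cls._hooks.append(fn)
        return fn

    def save(self):
        for hook in type(self)._hooks:
            hook(self)  # side effects fire here, far from the caller

sent = []  # stands in for an outbox / SMTP server

class User(Model):
    _hooks = []  # per-model hook list

    def __init__(self, email):
        self.email = email

@User.after_create
def send_welcome_email(user):
    sent.append(user.email)  # imagine smtplib here instead

# A test fixture that just wants some rows in the database...
for i in range(3):
    User("u%d@example.com" % i).save()

print(len(sent))  # ...has quietly "sent" 3 emails
```

Nothing at the `save()` call site hints that email goes out, which is exactly the COMEFROM-style spooky action the parent comment objects to.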

graywh 9 minutes ago 0 replies      
Avdi Grimm has a "book" on separating business logic from Rails - http://objectsonrails.com/
Monads in Python (with nice syntax) valuedlessons.com
42 points by llambda  4 hours ago   14 comments top 7
dustingetz 2 hours ago 1 reply      
i've worked through this article a couple times and his implementation never really sat well with me; the "with nice syntax for monad comprehensions" bit really obfuscates the implementation.

here's python source to a monadic lisp interpreter, which i wrote to follow along the paper "monad transformers and modular interpreters" [1]. i think this is a much simpler implementation of python monads than provided in the ValuedLessons article. https://github.com/dustingetz/monadic-interpreter

Study of this implementation will teach you why nobody actually uses monads in python for non-toy projects. A literal port of this code to clojure would feel so much more idiomatic and not hacky at all.

here are some smaller code dumps demonstrating the fundamental concepts to that monadic lisp interpreter:

http://www.dustingetz.com/2012/10/02/reader-writer-state-mon... http://www.dustingetz.com/2012/10/07/monads-in-python-identi...

[1] http://haskell.cs.yale.edu/wp-content/uploads/2011/02/POPL96...

almost 2 hours ago 0 replies      
Can anyone explain to me how the continuation Monad is working there? As far as I know there's no way to return from the same yield twice, which is what a callcc would need to do. I'm going to carry on trying to puzzle it out but a pointer would be helpful!

EDIT: I just tried out the code and it doesn't support multiple returns. Isn't that pretty much the thing that defines the Continuation Monad or am I just not getting it?

klibertp 52 minutes ago 0 replies      
I went through this article some time ago and, while it's certainly a neat implementation and worth understanding (it was the first recursive generator that made sense to me - thanks to this I had a much easier time understanding Scheme's call/cc and its uses), it needs to be noted that it is not a generic monad implementation. You cannot, for example, implement the list monad using this implementation and syntax (I tried very hard and couldn't, and I saw the same said in comments below the article - but if it's possible, please let me know!), because you cannot "rewind" Python's generators.

So, while it's a nice implementation for some kinds of monads, it's not nearly general enough to be (IMHO) called "monads in Python" - it is possible to implement fully general monads in Python, but you need nested lambdas and there is no getting away from that.

Still, very nice hack and worth spending an evening (or two) to understand how it works.
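The "no rewinding" limitation mentioned above can be seen directly: a Python generator frame can be resumed past a given yield only once, while the list monad's bind must re-run the rest of the computation once per element. A minimal demonstration (names invented):

```python
# Why the generator trick can't express the list monad: each yield
# in a generator can be resumed exactly once, but List's bind must
# re-enter the continuation once per element of the list.
def comprehension():
    x = yield [1, 2, 3]   # "draw" x from the first list
    y = yield [10, 20]    # "draw" y from the second; 6 paths total
    yield x + y

gen = comprehension()
first = gen.send(None)   # runs to the first yield -> [1, 2, 3]
gen.send(1)              # bind x = 1; fine so far
total = gen.send(10)     # bind y = 10; yields 11
try:
    gen.send(2)          # attempt to rewind and bind x = 2 instead
    rewound = True
except StopIteration:
    rewound = False      # the frame is spent; generators don't rewind
```

Continuation-heavy monads like List (or a real call/cc) need to revisit the same point in the computation with different values, which this single-shot resumption model cannot provide.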

freework 2 hours ago 3 replies      
This is missing the one thing that all other "let me explain monads to you" articles are missing: a real example of what problems monads solve. As far as I know, a monad is some kind of mathematical construct that is very useful when writing mathematical proofs, but has no use for the everyday programmer. The examples in these kinds of articles are always completely abstract. You can very easily describe a database using the "customer/phone number" example, yet I've never seen such an example with monads.
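For what it's worth, the requested "customer/phone number"-style example is not hard to sketch. Here is a Python illustration (data and helper names invented) of the everyday problem the Maybe monad solves: chaining lookups that may each fail, without a pyramid of None checks.

```python
# Maybe/bind in one line: the None-check lives in bind, not at
# every lookup site. Data below is invented for illustration.
def bind(value, fn):
    return None if value is None else fn(value)

customers = {
    "alice": {"contact": {"phone": "555-0100"}},
    "bob":   {"contact": {}},          # no phone on file
}

def phone_of(name):
    c = customers.get(name)
    return bind(c, lambda cust:
           bind(cust.get("contact"), lambda contact:
                contact.get("phone")))

print(phone_of("alice"))  # 555-0100
print(phone_of("bob"))    # None - missing phone short-circuits
print(phone_of("carol"))  # None - missing customer short-circuits
```

Each failure point short-circuits the whole chain; that plumbing is the Maybe monad, even though no class named "Monad" appears anywhere.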
crucialfelix 3 hours ago 1 reply      
The problem with Monads is that the people who understand them are the least qualified to explain them to other people.
hxseven 4 hours ago 0 replies      
BTW: This was already submitted to HN.


But that doesn't make it less interesting ;)

temac 3 hours ago 1 reply      
This is very interesting, very intelligent. Indeed, it is too intelligent. There are a huge number of highly valuable programmers who won't be able to understand all that mess, and the gain is questionable at best, given that there are alternate constructs that strike a way better compromise between expressiveness and maintainability.

Use the right tool in its intended way: don't try to retrofit constructs into languages where they don't fit, just for an accessory characteristic. (Every line of an imperative language is conceptually kind of a monad, so the article here is just trying to convince us that yet another form of hidden goto is good, via an ad-hoc extension in a language where more than half of the need for monads is missing.)

GitLab 4.0 Released gitlabhq.com
78 points by andrewmunsell  6 hours ago   36 comments top 8
josscrowcroft 1 hour ago 0 replies      
This might be the push I need to install GitLab for my biz and site deployments, and for bringing on contributors... until now, it seemed like too much hassle when I can simply pay $7/mo to use GitHub's mini plan.

Looks fantastic though, congrats.

n0nick 4 hours ago 4 replies      
GitLab looks awesome, both feature- and design-wise, but the installation process is so tedious and complicated that I quit in the middle of it more than once.

This is feedback I heard from fellow developers as well.

I'm curious as to whether the maintainers are planning to simplify the list of dependencies and installation steps in the upcoming versions.

MartinMond 2 hours ago 1 reply      
GitLab is incredible. I just read the documentation on setting up databases: https://github.com/gitlabhq/gitlabhq/blob/master/doc/install...

So it now supports both MySQL and Postgres, even though MySQL is preferred. Does anyone know what that means for a Postgres guy like me? Does "MySQL is preferred" mean that it has known bugs on Postgres? Is it usable?

mikhailt 51 minutes ago 1 reply      
Does anybody know if GitLab can limit access to repos for certain people while the rest have unlimited access?

An example: the CS team has access to issues/wikis only, while the dev team has unlimited access.

This is our only deal breaker with GitHub.

If you know of other git-based system/service that does this, please let me know.


sippndipp 24 minutes ago 0 replies      
We've been using GitLab for 3 months now and it's an amazing project. Kudos to everyone who helped!
mountaineer 1 hour ago 0 replies      
I've been looking at options for Git repos inside the firewall so happy to see this. Has anyone used Atlassian Stash too? How does it stack up vs GitLab? I've been using GitStack for a few months, which hasn't been bad for the basics.
rustc 5 hours ago 2 replies      
GitLab seems to be getting better and better, every month.

To anyone who is using this, how would you compare 4.0 to the current GitHub, in terms of UI/Features?

I'm thinking of moving from a third party host to GitLab. What should I be aware of?

daniellockard 2 hours ago 0 replies      
Man, the feature that I've wanted for a LONG time is in this rev of GitLab. Groups / Namespaces and permissions based on those groups.
Whatever happened to the Hurd? " The story of the GNU OS linuxuser.co.uk
11 points by vorbote  1 hour ago   1 comment top
sneilan 1 minute ago 0 replies      
It sank like a turd.
Page Weight Matters chriszacharias.com
526 points by zacman85  19 hours ago   119 comments top 25
marcamillion 13 hours ago 4 replies      
This story reminds me of a founder I met recently, building a wildly successful project on the most meager of resources (continue reading, I include Google Analytics screenshots below).

This is a hacker in the truest sense of the word. He builds stuff just because he loves building stuff. He is a student at a University here in Jamaica. He isn't building it to be cool, or chasing the latest web 2.0 fad of the week. He didn't know HN until I introduced it to him, and he is young (say 19/20).

He built http://wapcreate.com - a WAP site creator.

That's right, WAP...not iOS or Android optimized HTML5 sites. Good, old fashioned WAP sites.

The most amazing part of the story, though, is that he is running it on 2 dedicated servers in Germany; it's a hack job (PHP, a bit of Ruby & a bit of Java). But once he picked up traction, he got so much traffic that he hasn't been able to keep the servers online.

In the first image - http://i.imgur.com/yEbyh.png - you will see that he got over 1.5M uniques. The vast majority of the time period covered here (the last year) was either very low traffic - pre-traction - or servers offline due to excessive traffic.

In the 2nd image - http://i.imgur.com/Pu8da.png - you will see that about 1.2M of those visits were in the 3 month period of June 1st - Aug 31st. His servers melted down towards the end of August and he ran out of money to pay his server bills. He eventually got it back up again a few weeks later, and the traffic spiked again and the servers crashed a few weeks later again.

In the 3rd image - http://i.imgur.com/HJ4gy.png - you will see that the vast majority of the visits are from Asia (even though he is 1 guy in the rural areas of Jamaica).

In the 4th image - http://i.imgur.com/JSQ48.png - and perhaps the most striking, you will see the diversity of devices that the visitors are coming from. Most of them are "feature phones", i.e. a multitude of versions of Nokia phones. Notice that this is just 1 - 10 of 508 device types.

He presented at a conference I went to, here in Jamaica, and he and I started speaking. I am helping him figure out how to proceed in a sustainable way. i.e. getting this thing stable, and then generating revenue.

After speaking to him for many weeks, I finally realized how insane his accomplishment is. Apparently, in all of this, he had been creating his site on computers that were not his. He either used his school computers, or borrowed machines from people. His Aunt is buying him a 2nd hand Thinkpad for Christmas - for which he is EXTREMELY stoked.

So while we are all chasing the billions doled out by Apple on the App Store and the newest, sexiest SaaS app idea with fancy $29/mo recurring revenue, with our cutting edge macbook pros and iPads - here is one guy using borrowed hardware, making a creation app for a technology that we have long since forgotten, generating crazy traffic and usage and struggling to even make a dime from his creation.

The world is a funny place, and this internet thing that we live on - is massive. As big as TechCrunch & HN are, there is so much more out there.

If you think you can help out in any way, either donating computing resources or anything else that can help us get this site back online and helping him start to generate revenue from this - then feel free to reach out to me.

P.S. If you want to dig into it some more, check out what some of the fans of the site are saying on its FB page. I am not trying to trick you into liking the page. It only has ~1500 likes, but you can see passionate users commenting (both complaining about the downtime and praising some of the features).

jerrya 18 hours ago 6 replies      
I'm on a pretty fast line, and I am sometimes appalled with how long it takes pages to load, and often it comes down to loading content I have no interest in:

  * comments
  * ads and images
  * headers
  * sidebars
  * toolbars
  * menus
  * social network tools
  * meebo bars (really google, really?)
  * javascript to load all of the above crap

The amazing thing is how often I can't even begin to read the page for what seems like much of a minute as the page takes so long to render, or various pieces of the page jump, and shift and scroll.

I find that tools that point various ad servers at localhost help, and tools like Adblock Plus that load the crap but keep it off the page help too - but even more so Adblock Plus' filters, which let me shitcan all the crap.

One of these days I want to write an extension similar to adblock plus that seeks out and removes jquery crap. A lot of the reasons I can't read pages anymore seems to be jquery slideshows, jquery toolbars, jquery popups and the like.

I am pretty sure that if we graph this out, we'll find the end of the web occurs sometime in 2018, when page designers and their bosses and engineers and marketing pukes have so larded down pages that the net runs out of available bandwidth and any page takes 4:33 to load.

nikcub 10 hours ago 1 reply      
There are a few ways to see this data in Google Analytics. You can view a world map of your site with page load time and server connection time color-coded.

The first, and easiest way is to go to Content > Site Speed > Overview. By default this will show you a chart of page load time over time.

First, to get enough data, change the time scale to a full year. Underneath the date picker there is an icon with 16 dots in a 4x4 arrangement, with some dots filled in. Click on that and move the slider all the way to the right. This will ensure higher precision and will capture some of the slower page loads.

At the bottom, in the 'Site Speed' section instead of 'Browser' select 'Country/Territory'. It will change the data from pages to countries. Now click on 'view full report' and you will get a world map with page load times.

It will look something like this:


The site I just did it on doesn't have enough data, but if you have a fairly popular site you should see a nice variation in page load times.

Google have a post about this on their Analytics blog with much better maps and more information:


Their maps look a lot better.

The other way of doing it is to create a custom report. I have one called 'Global Page Load'. Add 'Country/Territory' as a dimension, and 'City' as a sub-dimension. You may also drill in further by adding browser version, mobile, javascript, etc.

As metric groups, I have - in this order: DNS Lookup time, Avg Server Connection Time, Avg Server Response Time, Avg. Page Load Time.

This then gives you a pretty report where you can immediately see which visitors are getting slow responses, and you can further drill in to see what type of connections and which browsers or devices are slow. I was surprised that my light page, with compressed CSS and everything static cached, was still taking ~20 seconds to fully load from 30% of countries.

That is for pages that load; you need to add another tab to the report with error metrics to see who isn't getting through at all, and you would need to look at server logs to see who isn't even loading the Google Analytics javascript. All very handy, and eye-opening in terms of the types and speeds of connections that a lot of web users are on.

Too many sites are guilty of having pages that are just far too heavy - as if they only test from their 100Mbit city-based connections. I am in Australia with a 1st world internet connection at 24Mbit and I avoid theverge.com's desktop site because of the page load.

Edit: If anybody can work out a way to share custom reports in Google Analytics, let me know - I would be interested in sharing reports with others, for specific cases such as this.

rwg 11 hours ago 4 replies      
Since my home router and wireless access points all run Linux, I can play all kinds of sadistic games with routing, tunneling, traffic shaping, etc.

One of the experiments I did a while back was creating a "satellite Internet connection from halfway around the globe" simulator on a dedicated wireless SSID. Basically, I created a new SSID and used Linux's traffic control/queueing discipline stuff to limit that SSID's outbound throughput to 32 kbps, limit inbound throughput to 64 kbps, add 900 milliseconds of latency on sent and received packets, and randomly drop 4% of packets in/out. Very, very few sites were even remotely usable. It was astonishing.

I think one of the most useful products that could ever be created for web developers is a "world Internet simulator" box that sits between your computer and its Internet connection. (Maybe it plugs into your existing wireless router and creates a new wireless network.) It would have a web interface that shows you a map of the world. You click a country, and the simulator performs rate shaping, latency insertion, and packet loss matching the averages for whatever country you clicked. Then devs can feel the pain of people accessing their websites from other countries.

(Thinking about this for a minute, it could probably be done for about $30 using one of those tiny 802.11n travel routers and a custom OpenWRT build. It would just be a matter of getting the per-country data. Hmmmmm...)

aaronjg 16 hours ago 2 replies      
This is an interesting example of why randomization in experiments is important. If you allow users to self-select into the experiment and control groups, and then naively look at the results, the results might come out opposite to what is expected. This is known as Simpson's Paradox. In this case, it was only the users for whom page load was already the slowest that picked the faster version of the page. So naively looking at page load times made the pages look like they loaded slower.

However once Chris controlled for geography, he was able to find that there was a significant improvement.

Moral of the story: run randomized A/B tests, or be very careful when you are analyzing the results.
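The paradox is easy to verify with toy numbers (invented for illustration): the light page is faster in both regions, yet looks slower in aggregate, because slow-region users disproportionately opted into it.

```python
# Simpson's paradox with an opt-in page variant.
# Each entry maps region -> (avg_load_seconds, visit_count).
heavy = {"fast_region": (2, 900), "slow_region": (20, 100)}
light = {"fast_region": (1, 100), "slow_region": (10, 900)}

def avg(groups):
    """Visit-weighted average load time across regions."""
    total = sum(t * n for t, n in groups.values())
    visits = sum(n for _, n in groups.values())
    return total / visits

# Light wins in every region (1 < 2 and 10 < 20)...
print(avg(heavy))  # (2*900 + 20*100) / 1000 = 3.8
print(avg(light))  # (1*100 + 10*900) / 1000 = 9.1  "slower"?!
```

The aggregate comparison reverses because the mix of who chose each variant differs, which is exactly why the post's author had to segment by geography (or, better, randomize) before the improvement showed up.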

morpher 17 hours ago 2 replies      
In addition to the developing world, page weight is a big concern for those of us with limited data plans. When a single blog post weighs in at 2-3 MB (which is common), the day's allotment of a 200MB/mo plan disappears really fast.
I often end up only reading the HN comments while on the bus since I don't want to play Russian roulette with the article size.

(of course, this is unrelated to YouTube, but to the general sentiment of the article)

dsr_ 7 hours ago 0 replies      
All of you people pointing out that AdBlock and NoScript and using w3m really helps reduce load times:

you're right. And missing the point.

A large chunk of the world is permanently bandwidth starved, and most of the mobile world lives on transfer caps and long latencies. If those are plausible scenarios for your audience, you need to reduce the quantity of invisible bits you are sending them.

What's an invisible bit? Any bit that does not directly create text or images on the screen is invisible. Anything that runs client-side. Some of it may be necessary, but most of it is probably boiler-plate or general-purpose when what you actually need is more limited. Reduce!

habosa 18 hours ago 4 replies      
I like this article a lot; it makes a good point that many of us often overlook. What's the dominant feeling about using ajax to reduce page weight? Is it ok if my page is 1MB but only the initial 100KB is sent synchronously and necessary for interaction? An example would be YouTube sending just the main page layout and video player and then async loading all of the related stuff and comments.
gregpilling 5 hours ago 0 replies      
I had this argument about 4 years ago with a young colleague who said I was out of date, and I am not a developer. Brett Tabke of Webmasterworld.com has been arguing for years that lightweight matters. It used to matter for dialup in the USA, now with widespread broadband it is less important to those people. Now it is important for mobile users and global users on dialup or slow connections. In a few years it will matter to the people that got their first android phone in some country with poor infrastructure.

Even personally it matters to me :) since I live in a 50 year old house that has slow DSL (YouTube buffers constantly) and I live in the center of Tucson, AZ.

I recall Google doing a study that increasing the speed of browsing increases the use of the internet. Page weight will always matter to someone.

orangethirty 16 hours ago 2 replies      
Re: Page Weight.

I spent a bit more than a month studying how to build Nuuton in a way that it did not end up being a front-heavy piece of crap. Looked at a lot of different options. Considered client-side JavaScript MVC frameworks. Looked at the CSS side of things. Went as far as thinking about developing our own little app for Nuuton (no browser needed). But in the end, the simplest choice won. I decided to go with flat HTML/CSS. No Bootstrap or framework either. Plain old markup and stylesheets. It uses very little JavaScript (no jQuery either), just what the simple UI requires (an event listener for the enter key, and an event listener for 4 links). Nothing else. No animations. No fancy scrolling. No carousels. Nothing like that. It's rendered on the server side (Django), and gets served to you in a very small package. I even went as far as breaking down the stylesheets so that you don't need to download CSS that does not apply to the page you are visiting. And you know what? It works, feels, and looks amazing. Even if the project ends up being a failure, achieving such little victories is just deliciously fun.

Where did I get the inspiration to go against the current industry trends? HN's simple yet functional HTML setup. My god, it features a million tables, but the damn thing just works beautifully. By the way, Nuuton also uses tables. :)

keithpeter 9 hours ago 2 replies      
I live less than one and a half miles from the centre of a major European city. We currently get 0.24Mbits/s download and 0.37Mbits/s upload on ADSL over the phone line. My broadband provider's help line has confirmed no faults on the line, or with my router (I've swapped the cables and router, and tried several computers). They think it's an automatic profiling change because we are often away from home and switch the router off. I'm battling my way through the help line 'levels', which is proving to be 'interesting'. I may go to a smaller, more expensive provider in the new year just so I can run an rdp session and get an interface I can work on.

The low bandwidth experiment has been educational. On Firefox/Ubuntu, you get the little status bar at the bottom that shows the requests. Some pages have a lot of those, and take ages to load. Distro-hopping is feasible (I'm trying out different interfaces), a CD-ROM downloads overnight quite easily. Software updates are a killer (go and have dinner, listen to some music...).

I've started using a command line web client (w3m in my case) to get to the text more quickly. You get to see which sites have 'skip to content' links embedded early in the page and with CSS set to not show them on Firefox. You also get to see which sites provide all content via html and which sites 'wrap up' their content in javascript functions &c.

As many here provide Web content and applications, just try a command line Web browser on your site...

plouffy 18 hours ago 1 reply      
I must not be very clever, as it took me 3-4 attempts to understand the second-to-last paragraph (i.e. that without Feather it took 20 minutes, while with Feather only 2 minutes, to load a video).
ghshephard 18 hours ago 8 replies      
While I appreciate (and agree with) the overall theme (make your pages light so people on < 10mbit/second links have access to them, opening them up as a market to you - this works well for people on older GPRS/GSM links in the developing world) - I don't understand the math. Presuming there are parts of the world still stuck with 1997 analog modem technology, then 100 kilobytes / 56.6 kilobits/second = 14 seconds.

At two minutes, there are people out there with 6.6 kilobit links connecting to youtube?

I suspect there might be a bit of hyperbole in this article as well, because even if there were people connecting on ultra slow links, the average latency of those connections is likely to be wiped out by the tens of millions of broadband links.
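The back-of-envelope arithmetic in the comment above checks out; spelled out in Python:

```python
# 100 KB over a 56.6 kbit/s modem, and the link speed implied by
# a two-minute load (ignoring protocol overhead and latency).
page_kbit = 100 * 8               # 100 kilobytes = 800 kilobits

modem_secs = page_kbit / 56.6     # a 1997 dial-up modem
needed_kbps = page_kbit / 120     # rate implied by a 2-minute load

print(round(modem_secs, 1))   # 14.1 seconds on dial-up
print(round(needed_kbps, 1))  # ~6.7 kbit/s for a 2-minute load
```

So the two-minute figure does imply links well below dial-up speed, unless overhead, latency, and packet loss (rather than raw throughput) account for the gap.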

tomfakes 14 hours ago 0 replies      
There's a great resource for testing page load speed for different browsers/locations/connection speeds.

Frankly, I'm stunned that they run this at no cost.


cedricd 3 hours ago 0 replies      
This post makes a ton of sense. I spent a year traveling in the developing world a couple years back and was literally unable to watch a YouTube clip the entire time. It's amazing what parts of the Internet you can't see the moment you're on an insanely slow connection.

I was in Myanmar for a bit and their internet was so slow that I couldn't even check news or email -- no need for a China-style firewall.

KMBredt 9 hours ago 0 replies      
Feather even exists today: http://www.youtube.com/feather_beta and works as far as I can tell. You can opt into it using the link.
drivebyacct2 13 hours ago 1 reply      
Can someone with data give me a comparison of the people with the slowest connections and which browsers they use?
ricardobeat 18 hours ago 1 reply      
> places like Southeast Asia, South America, Africa

Large swaths of South America have broadband access. I guess South Africa does too. Being a little more specific would be helpful.

grego 11 hours ago 0 replies      
Indeed, many who extol the virtues of cloud computing somehow have a blind spot to the fact that network coverage on this globe is not even. Actually it would be nice to see a color coded world map of network strength, is there one?

Bad network coverage is a problem that can be addressed many ways. Shameless plug: I've been working on offline wikipedia that by its nature is always available and fast, everywhere: http://mpaja.com/mopedi I'm sure there are many unexplored niches where the cpu power in your hand can be put to good use without needing the network.

Eliezer 14 hours ago 0 replies      
Simpson's Paradox ftw!
winter_blue 15 hours ago 1 reply      
Oh, I totally know it does, because I've been stuck on and off on slow connections recently -- and a lot of pages (HN is a big and awesome exception) load quite slowly. I even ended up using Gmail's HTML interface for a while.
spyder 10 hours ago 0 replies      
What about compression? Is he talking about gzipped size or the uncompressed page?
dainix 10 hours ago 0 replies      
We are working on the same thing: getting our site as lightweight as possible! Oh man, thinking that for somebody the site takes 20 min to load makes me cringe!!!
davars 18 hours ago 1 reply      
So the code to load the videos came in under 100KB. How large were the videos themselves?
jarcoal 18 hours ago 1 reply      
I'm not really sure this post provides any useful information.
Mumford & Sons Warn Against 'Unauthorized Lending' of Their CD wired.com
8 points by rosser  1 hour ago   3 comments top 3
meaty 1 minute ago 0 replies      
Storm in a teacup will result in more bittorrent seeds...

More seriously: my kids' school is always using their smart whiteboards to play YouTube videos and DVDs, particularly in the pre-Xmas slacking week (the one that the teachers think everyone is entitled to in the UK). How does this fare with such terms? Would they sue a school?

amadeus 11 minutes ago 0 replies      
I think one of the first comments on the Wired article sums this all up pretty well:


TL;DR: The term "lending" has been on most records in the last 20 years and actually refers to public rentals, or something to that effect, and has nothing to do with letting your friend borrow your CDs.

druiid 16 minutes ago 0 replies      
Oh good. Another reason I can reasonably dislike this band. I find them to be a poor version of one of my favorite bands (The Avett Brothers).

I would think that having more people hear your music would be a good thing. Controlling, or attempting to control, who hears it echoes the Metallica 'Down with Napster' stuff. Either enjoy people listening earnestly to your music, or, well, you're just a really interesting marketing project and not musicians IMHO.

Game of Thrones Most Pirated TV Show of 2012 torrentfreak.com
14 points by Pr0  2 hours ago   12 comments top 3
michaelochurch 7 minutes ago 0 replies      
Honestly, I don't have a major moral problem with piracy. It's the lowest tier of price discrimination. The vast majority of people who pirate would simply not pay for the content if piracy weren't an option. There are content creators who encourage piracy because they have a better chance of going viral (and making more money) if it occurs.

I pay, when I can. I have enough money, and if I pirate, then I'm disenfranchising myself because, in pop culture, money is a vote. If I don't pay (vote) then I can't complain about garbage being produced because I'm a non-contributor. Piracy was OK when I was a college kid with very little money, but now that the cost of content is trivial in comparison to the time it costs me to watch something, I feel like I should take the legit route.

However, I don't buy cable. It's too expensive given that I'll never watch most of the channels, and Time Warner Cable is the epitome of Suck. Why should I pay so much for such terrible service? I am not going to "vote for" TWC just to watch Game of Thrones, which only requires a cable subscription because HBO was beaten into submission by the bad guys.

So I say: until HBO will take your money directly, pirate on.

dlss 52 minutes ago 3 replies      
"Piracy is almost always a service problem" - Gabe Newell

I understand that HBO has some sort of proprietary online video watching service, but speaking as someone with no TV and no desire to pay for basic cable merely to enable my paying for HBO, I do wish HBO would take Gabe's quote to heart.

azakai 15 minutes ago 1 reply      
Hmm, the torrentverse has surprisingly good taste,

> Game of Thrones

> Dexter

> Big Bang Theory

> How I Met Your Mother

> Breaking Bad

> The Walking Dead

> Homeland

Well, except for the last one ;) I kid, I kid.

Central bankers rethink their devotion to slaying inflation reuters.com
7 points by mtgx  1 hour ago   3 comments top 2
barking 38 minutes ago 1 reply      
Inflation isn't a dirty word to me.

It would be really good for me and people like me.
I have a mortgage that's taking a huge chunk out of my income and will do for the next 20 years while I'd get about half what I paid if I sold the house now.

Inflation would be bad for the people living off their savings, generally the elderly.

But then I think they in turn benefitted from an inter-generational wealth transfer in the 70s when there was double digit inflation for most of the decade.

pebb 15 minutes ago 0 replies      
Everyone will be a billionaire soon!
Ed McMillen: Ubuntu Store Sold Super Meat Boy Without Permission, Has Yet To Pay kotaku.com.au
170 points by kotakufanb  10 hours ago   68 comments top 13
jacquesm 10 hours ago 11 replies      
Pirates is way too heavy a term here. Canonical is large enough, and the issue muddy enough, that I don't think the accusation is warranted. They're also shooting themselves in the foot: instead of coming to some kind of amicable arrangement, they've now accused Canonical of a serious crime, which will lead to a response to that accusation rather than to a solution that would have benefited all parties.

If you deal with a company much larger than yours that made a mistake or did something you don't agree with, publicity is a means of last resort, not your first avenue for redress. And if you truly believe Canonical pirated your game, then you should sue them.

This is an excellent reminder why I prefer open source to closed source, projects like Arch and Debian would never suffer from this.

dpitkin 1 hour ago 1 reply      
Hi, I just looked into it, and the check to Tommy and Edmund from Canonical is in process for the 77 copies of Super Meat Boy. We have been working together since November to get this resolved; no piracy here, just some miscommunication.

David Pitkin

evmar 1 hour ago 0 replies      
I tried for nearly a year to get Google Chrome into the Ubuntu store. Each time I asked about it I got answers like "the relevant person is on vacation", "we'll get back to you", "sorry, forgot to respond". I eventually gave up. While I remain cranky about it, I try to believe they are just not well organized and it wasn't anything malicious, which I hope is the explanation here too.
jiggy2011 9 hours ago 0 replies      
How does one apply for a job doing PR for canonical?
I'm pretty sure even HN people could do a better job.
SquareWheel 9 hours ago 2 replies      
Please just link directly to the FormSpring rather than the blogified version. It gives all the relevant info.


huhtenberg 3 hours ago 0 replies      
7 hours into the discussion and no link to "Indie Game: The Movie", which follows the SMB devs for a few months prior to the launch?



shardling 9 hours ago 0 replies      
The word "pirates" does not seem to appear in the linked article.
saosebastiao 3 hours ago 1 reply      
This is typical of the behavior of companies that are trying desperately to stay afloat.
antidoh 7 hours ago 0 replies      
They just run things on a net-365 basis is all.
Tichy 8 hours ago 0 replies      
Too much drama. I'm sure this will be sorted out.
ma2000 8 hours ago 7 replies      
While watching the video/trailer for the most recent Humble Bundle, I was put off by the games this time around.

The one mentioned in this article, for example, looked like a basic Flash game. So, as an outsider to this indie gaming world, I can understand why someone might superficially reject it - it's probably a great game if you give it a chance.

Just thought that someone should point this out because the comments have so far been the opposite.

chris_wot 10 hours ago 1 reply      
Time to lawyer up!
VMG 5 hours ago 2 replies      
I thought piracy was cool now? Oh wait, it's only cool when the underdog does it. Never mind.
FBI Documents Reveal Secret Nationwide Occupy Monitoring justiceonline.org
48 points by mtgx  5 hours ago   20 comments top 5
DanielBMarkham 3 hours ago 5 replies      
What kind of idiocy is this? This is the FBI's job -- to monitor both foreign and domestic groups of national scope that might present some kind of threat to civil order.

They're not like the fire department, where they sit around waiting for something to happen. They're supposed to get out there and get proactively involved in all kinds of things from white supremacists to greens.

As a libertarian I enjoy a good rant about state security as much as the next guy, but I prefer to do so from an informed position. There's enough real things to worry about without going on about the FBI doing what they're supposed to be doing.

w1ntermute 4 hours ago 0 replies      
Is anyone surprised at all by this? It's not like this is new or something. Despite the various documented incidents of physical violence and intimidation inflicted by law enforcement officers on Occupy protesters, it comes nowhere near the sorts of horrors suffered by protesters during the Civil Rights Movement, for example - especially in the South, where protesters were often attacked by the police.
dquigley 3 hours ago 0 replies      
The presumptuous labeling is definitely questionable, but the coordination and communication seem laudable (for a government organization).

If there were a movement of people planning to protest Google or Facebook, I would expect the FBI to warn them if it had solid information that it was going to happen. In fact, if they were aware of large-scale protests against an all-but-convicted child killer, they would still have a responsibility to inform and protect. We protect criminals and saints equally in this country.

Second, in my mind there is no question that on both "sides", police and protesters, individual people broke laws. Protests bring out the worst in some of the police officers under pressure and in some of the protesters.
So the FBI and the agencies it coordinated with would have been failing at their job had they not monitored and reported in an effort to protect the employees of the businesses.

You might not like the protection the big banks got, but they should receive it, just as the most heinous criminal receives a lawyer to defend himself, access to protection from danger (vigilantes), etc.

So if we can step back from the emotions a bit and cast a critical eye on both sides of the protests, I think we will see an FBI that jumped to conclusions but did its job.

And finally, I find it surprising that in a startup forum that promotes agility and a lack of bureaucracy as the ideal, we are so quick to suggest more of it for an already bureaucratic, slow government.

derekerdmann 4 hours ago 1 reply      
Of course they did. Let's ignore for a moment that protest groups (like PETA) are usually considered some of the highest risks for terrorism; it would be irresponsible of the bureau to take the protest at face value and assume they have completely peaceful intentions.
nakedrobot2 4 hours ago 2 replies      
How much power do these scary bureaucratic agencies (FBI, CIA, NSA) actually wield in comparison to our elected politicians, anyway? Do they have any real accountability?
The Sex Tape Litmus Test laverick.org
82 points by dustin  7 hours ago   68 comments top 10
dsr_ 5 hours ago 3 replies      
The primary function of a legal department is to provide advice that prevents legally actionable mistakes.

This advice does not have to be sane, or efficient, or indeed have any consideration towards the interests of the company other than "prevents legally actionable mistakes". A few days ago HN saw an article about setting goals and perverse incentives. This is a simple example.

Hypothetically, someone was reviewing the Sony USA employment contract and saw that there were, perhaps, non-video-game related developments which might be valuable. Then they asked the legal department "Please supply contract terms that give us as much as possible." And after an hour or two of research, they did.

The surprising thing to me is that they tried to change language for existing employees out of cycle. If they did it during a regular review cycle, even fewer people would have noticed.

redcircle 3 hours ago 3 replies      
California has a nice law that says the employee owns IP created on his own time, unrelated to work, etc. This is a reason I moved to CA: the state I was in lacked such a law, and all the tech companies had contracts that failed the sex tape test. I suspect that for a place to flourish like Silicon Valley has, it needs a law that protects self-bootstrapping startups.
guard-of-terra 5 hours ago 6 replies      
I think that any funny clauses in the contracts should be abolished and the worker/employer relations should only be regulated by law.

This makes me pro-regulation and anti-market, but unfortunately I see exactly zero ways in which market can make contracts better. What are you expected to do in this situation - quit?

mgkimsal 4 hours ago 1 reply      
I call this the "child porn" clause, and I raised it about an employment contract once many years ago. The basic language was "we own anything you create". I said, "I don't really think you want that - if I create some child porn, you're the owners". I seem to remember I had some less restrictive language placed in my contract versus that one, but I don't think it changed anyone else's contracts.

Yes, it felt a bit 'nuclear' dropping such a charged statement like that, and even when I bring it up as an example in conversation, some people cringe - a 'sex tape' analogy might be less offensive to some, but the basic premise still stands. Any company that wants to claim ownership of every piece of content or code I 'create' needs to understand what that really entails. It might actually give some people license to work on legally questionable stuff (not child porn so much as, say, banned crypto), knowing that they don't really 'own' it and thinking someone else might be responsible for the consequences.

nakkiel 4 hours ago 1 reply      
IP assignment is the only thing I really negotiate when I take on a new position. I had one company back-pedal as they were trying to change their terms from nothing to "we own anything you make, any time". I had the CEO/CTO of another company write in plain English that anything done in my free time and without using company-owned facilities/hardware was my own IP (their legal bla-bla was unclear).

In the first case, the corrected terms got applied to everybody in the company, but in the second, I believe I'm the only one who is protected, thanks to that written note.

I always use the analogy of an English teacher writing a book in his spare time, and how he would actually be encouraged to do so, weighing how this would reflect nicely on the school he works at, etc.

jasonjei 3 hours ago 0 replies      
IANAL, but I have heard of something called the Reasonable Person Standard. Since the US is based on Common Law, I believe this standard could be used if this were really tested in court:

"The reasonable person (historically reasonable man) is one of many tools for explaining the law to a jury.[1] The "reasonable person" is an emergent concept of common law.[2] While there is (loose) consensus in black letter law, there is no universally accepted, technical definition. As a legal fiction,[2] the "reasonable person" is not an average person or a typical person. Instead, the "reasonable person" is a composite of a relevant community's judgment as to how a typical member of said community should behave in situations that might pose a threat of harm (through action or inaction) to the public.[3]
The standard also holds that each person owes a duty to behave as a reasonable person would under the same or similar circumstances.[4][5] While the specific circumstances of each case will require varying kinds of conduct and degrees of care, the reasonable person standard undergoes no variation itself.[6][7]
The "reasonable person" construct can be found applied in many areas of the law. The standard performs a crucial role in determining negligence in both criminal law - that is, criminal negligence - and tort law.
The standard also has a presence in contract law, though its use there is substantially different.[8] It is used to determine contractual intent, or if a breach of the standard of care has occurred, provided a duty of care can be proven. The intent of a party can be determined by examining the understanding of a reasonable person, after consideration is given to all relevant circumstances of the case including the negotiations, any practices the parties have established between themselves, usages and any subsequent conduct of the parties.[9]"

bretthardin 2 hours ago 0 replies      
While I was working at Earthlink in 1999, they had a similar clause and I had a similar thought.

Although not about a sex tape: I thought about a computer virus released from my Earthlink corporate email account. If I had sent the virus out, it would technically have belonged to Earthlink and not me. However, when I talked to a lawyer about it years later, he explained there are ways the corporation could get out of the clause.

btilly 1 hour ago 0 replies      
Note: the contract, as described, would not hold for California employees.

See http://www.leginfo.ca.gov/cgi-bin/displaycode?section=lab... if you don't know what I'm talking about.

(That said, Sony probably does enough different things that the difference does not matter much to most people.)

matt2000 1 hour ago 1 reply      
In case anyone is in a position to hire programmers and cares about treating creative people fairly, we have an open source Hacker Employment Contract: https://www.docracy.com/hackercontract that tries to fairly handle stuff like IP created after hours.
gnu8 40 minutes ago 0 replies      
This story is missing the best part, which is where the compliance officer bluescreens after being asked about whether the sex tape would be owned by Sony.
An application of Linear Programming in Game Theory alabidan.me
16 points by alabid  3 hours ago   3 comments top 2
gweinberg 3 minutes ago 0 replies      
"Note that this doesn't mean that Daniel will always lose the game but that he can lose by at most 1/12 the value of the game. If Nick doesn't play optimally (Nick doesn't use his optimal mixed strategy), Daniel will most likely win!"

I don't think this is correct. I think if Daniel plays his optimum strategy, Nick will get the same payoff no matter what he plays.

I think this is a fairly general result: if one player is playing the optimal strategy, then once the other player has eliminated options he should never play, it doesn't matter how his choices are distributed among the remaining options.
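
gweinberg's indifference claim can be checked numerically with the standard LP formulation of a zero-sum game. The sketch below is my own illustration - it uses a made-up matching-pennies payoff matrix, not the Nick/Daniel game from the article - and solves it with scipy's linprog:

```python
# Solve a zero-sum game by LP and check the indifference property:
# against an optimal mixed strategy, every opposing pure strategy in
# the support earns the same payoff.
import numpy as np
from scipy.optimize import linprog

# Hypothetical payoff matrix for the row player (matching pennies),
# NOT the game from the article.
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

# Shift payoffs positive; the game value shifts by the same constant.
shift = 1.0 - A.min()
B = A + shift

# Standard trick: with y = x / v, maximizing the value v is equivalent
# to minimizing sum(y) subject to B^T y >= 1, y >= 0.
m, n = B.shape
res = linprog(c=np.ones(m), A_ub=-B.T, b_ub=-np.ones(n),
              bounds=[(0, None)] * m)
v = 1.0 / res.fun        # value of the shifted game
x = res.x * v            # row player's optimal mixed strategy

payoffs = x @ B          # payoff of each column against optimal x
print(x, payoffs - shift)  # ~[0.5 0.5] and ~[0. 0.]
```

Here every column yields the same (zero) payoff against the optimal strategy, matching the comment's point that an optimizing opponent is left indifferent among his undominated options.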

zissou 2 hours ago 1 reply      
Always happy to see some game theory on HN. If you're looking for a good book that focuses more on how game theory is actually used in practice versus the more computational exposition here, then I'd recommend a very readable and cheap book called "Game Theory for Applied Economists" by Robert Gibbons (Google Preview: http://books.google.com/books/p/princeton?id=8ygxf2WunAIC). The book has only 4 chapters which cover the 4 different types of games:

1. Static Games of Complete Information

2. Dynamic Games of Complete Information

3. Static Games of Incomplete Information

4. Dynamic Games of Incomplete Information

This segmentation covers all possible types of games. It's great because then you only have to decide if the game is static vs. dynamic and whether it's a game of complete vs. incomplete information (remember, perfect/imperfect information is not the same as complete/incomplete information). If you can answer those 2 questions, then you know what kind of equilibrium is relevant. For example, if it's a game of incomplete information (meaning that there is a move of nature, or equally, that the players don't necessarily know the types/payoffs of the other players), then you know that you are playing a Bayesian game, and hence the equilibrium (if it exists) will be some kind of Bayesian Nash equilibrium.

You can always express a game of incomplete information as a game of imperfect information (see: Harsanyi transformation). However, here's something to think about: What do you lose when you transform a game from extensive form (a tree) to strategic form (a matrix)? The answer: Timing.

VPN access being disabled in China rendezvous.blogs.nytimes.com
39 points by ajitk  6 hours ago   12 comments top 9
austenallred 37 minutes ago 0 replies      
This sucks for travelers and ex-pats, but for China's future this is a very, very, very big deal.

I lived in Shanghai last year, and Chinese Internet surveillance is unreal. I could use Gmail chat to talk about Tiananmen Square, but as soon as I did, all of my Google apps would suddenly become unavailable. I can only assume that when I used certain keywords, my every chat was being monitored. A VPN was the only way I could access YouTube, Twitter, Facebook, and even some Google searches.

But the reality is that 90% of the young population of Shanghai didn't really care what the "great firewall" did, because EVERYONE used a VPN. I saw more people watching YouTube in China than I do in the States, even though Chinese versions of these platforms exist. Some platforms, like RenRen (Facebook-like, but more similar to Russia's VKontakte), were popular, but most people just used the US-built versions. Now most of them won't be able to.

This absolutely terrifies me. I was literally minutes away from being on the bullet train from Shanghai to Beijing that killed "x" people. Chinese authorities cited incredibly low numbers for a train traveling at 300 km/h; most non-state observers cited hundreds of deaths, while China slowly grew its figure from 20 to 40.

It's illegal for foreigners to talk about the "Three Ts" with Chinese nationals - Tibet, Taiwan, and Tiananmen Square. But previously the youth learned through their VPNs, which let them access the outside world. With that shut down, the government might as well be burning books.

aneth4 11 minutes ago 0 replies      
I'm in Shanghai where I've lived off and on for 8 years. I've been using an ec2 image with Poptop installed. The problem is the IP addresses of the major vpns become known and blocked.

Any suggestions of software that would deploy images to various cloud services on behalf of users? I don't think China would be able to block all of ec2 and Rackspace, though they do sometimes seem to throttle ec2.

rossjudson 2 hours ago 0 replies      
Before traveling to China, create a throwaway email account on some service, possibly Yahoo. Don't touch your real email accounts while you're there, if possible. The only time I've ever had an email account hacked was following use in China.
jerguismi 2 hours ago 0 replies      
Time to create a Bitcoin-enabled p2p VPN market?

I have thought about the idea for some time. The marketplace operator could take something like a 30% cut. Any private individual could sell their internet connection to the Chinese and earn some bitcoins in the process.

There could be rules to stop the Chinese government from learning which IPs operate in the market. For example, someone could buy a certain VPN/IP address recurringly, and others couldn't purchase that specific IP - that way the government would have no way to know how that specific connection is used.

And of course, Bitcoin isn't a very easy or well-established payment method - bring in resellers/market makers from China. These could (with some easy-to-use software/API) resell these VPNs to Chinese individuals.

tcoppi 41 minutes ago 0 replies      
Does anyone know of any work related to automatically making arbitrary traffic "look" like, say, an HTTP session? I'm thinking of something that would automatically encode a VPN session as a valid, renderable HTML document (and not via the trivial way of just gzipping it and making it look like a compressed HTTP response, as I'm sure that would still be easy to block). It seems like this should be possible, albeit with a huge performance decrease, but I can't find anything.
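
For the syntactic half of what the parent asks, here is a toy sketch (entirely my own illustration, not an existing tool or protocol): tunnel bytes as base64 text inside an HTML page that a browser would actually render. As the parent suspects, this alone would still be easy to flag - real cover traffic also has to mimic statistical fingerprints (packet sizes, timing, entropy), which this does nothing about:

```python
# Toy "VPN bytes as renderable HTML" codec. Illustrative only: a DPI box
# could trivially flag pages whose body is one giant base64 blob.
import base64

PRE_OPEN = '<pre id="log">'
PRE_CLOSE = '</pre>'

def encode_as_html(payload: bytes) -> str:
    """Wrap opaque bytes in a valid, renderable HTML page."""
    b64 = base64.b64encode(payload).decode('ascii')
    return ('<!DOCTYPE html><html><head><title>Build log</title></head>'
            '<body>' + PRE_OPEN + b64 + PRE_CLOSE + '</body></html>')

def decode_from_html(page: str) -> bytes:
    """Recover the bytes hidden between the <pre> tags."""
    start = page.index(PRE_OPEN) + len(PRE_OPEN)
    end = page.index(PRE_CLOSE, start)
    return base64.b64decode(page[start:end])

packet = b'\x00hypothetical VPN packet\xff'
page = encode_as_html(packet)
assert decode_from_html(page) == packet
```

The round-trip works for arbitrary bytes; making the resulting page and its delivery indistinguishable from ordinary browsing is the genuinely hard part.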
BrianPetro 2 hours ago 0 replies      
Here is a real test for Anonymous: take down the world's most notorious firewall.
jasonjei 3 hours ago 1 reply      
How is Cisco IPSec affected by this blockage? Any business or foreign mission conducting transactions in China should be very wary if they start targeting IPSec in any way.
ajitk 4 hours ago 0 replies      
VPN and SSH[1] have been means of evasion, but there has been anecdotal evidence of "unstable" VPN[2] and SSH connections before.

[1] http://en.wikipedia.org/wiki/Great_Firewall_of_China
[2] http://www.guardian.co.uk/technology/2011/may/13/china-crack...

gimbuser 3 hours ago 1 reply      
It has to be very selective, otherwise it would disturb a hell of a lot of state admins and companies :P
First Sale Under Siege: If You Bought It, You Should Own It eff.org
6 points by zoowar  1 hour ago   1 comment top
ryan_s 57 minutes ago 0 replies      
How does this currently apply to software? Specifically desktop software apps, and OSs.
Ten-year-old invents light-up crosswalks, IBM makes them real grist.org
74 points by DanLivesHere  4 hours ago   32 comments top 6
rdl 3 hours ago 5 replies      
I think we actually have those in some places in California. Not as perfectly as she's describing, but a bunch of LED bumps installed at the edges, which flash when someone's pressed a button.

Sensors and bigger stripes would be nice.

(It's a great idea, and especially if she never saw one before, shows a great thought process.)

rickdale 4 hours ago 1 reply      
Cool idea. I like that they didn't choose a random product to try and market, but an actual real-world application. As a driver more than a walking pedestrian, I think these light-ups would be awesome on the road, even during the day. I consider myself a good driver, but this would definitely help out.

I have to add that generally a 10 year old is not going to give you the next big app idea. 10 year olds will have no more creative ideas than you or me. But just like the lottery, it could happen.

guard-of-terra 3 hours ago 2 replies      
I am sorry but the idea is not new at all.

And the implementation is in the works - laws being passed to make those required.

jtchang 1 hour ago 0 replies      
I believe outside San Francisco City Hall we have a crosswalk that lights up, I believe with LEDs that flash. I like this one way better.

Does it turn off immediately when there is no one in the crosswalk?

revelation 3 hours ago 7 replies      
What an incredibly "out of the box" idea we have here. Sorry, I don't want to sound mean, but it's a sad state of affairs when 10-year-olds grow up fully accustomed to a world where most public space is taken up by two-ton combustion engines. It cuts out all the vital critical thinking.

The question shouldn't be "how can we make crosswalks safer for pedestrians" but "why are there cars and roads where pedestrians walk". Otherwise we end up with things like the "bicycle lane".

easy_rider 1 hour ago 2 replies      
Couldn't some kind of luminescent paint be utilized instead?
Paul Graham may be right, but Chris Zacharias is righter danshapiro.com
75 points by danshapiro  10 hours ago   22 comments top 9
RyanZAG 8 hours ago 2 replies      
Feels a bit like this whole issue is conflating the management team with the investors - which is obviously very common in the startup world.

Need to take a step back and look at the primary purpose of acquiring investors: taking capital from them now in exchange for future cash flows to them later. To maximize this in favour of our business, we need to take the most money for the minimum amount of equity sold.

Now look at the primary purpose of the management team: to use the assets and resources placed in their care to create the highest future cash flows possible. This includes the tasks that Chris is conflating with investors: securing deals, finding the best lawyers and accountants, getting meetings with difficult people. The management team is then compensated directly for their efforts.

The above is how it works in traditional companies. The investors invest capital and decide on the management team. The management team actually runs all facets of the business. In the startup world this relationship is not as simple, as the founders are both the primary shareholders and the management team itself.

What Chris is proposing is not as outlandish as it sounds - he is proposing bringing the investors into the management team by selling shares to them at a discount. If a share is worth $10 on the open market but we sell it at $5 in exchange for valuable help from investors, then we are doing something very simple: we are paying these investors to be part of the management team, and in this case we are paying them $5 per share. This is a good option to take if the skills they bring are worth the $5, and a very bad option if we could hire better or more skills by taking the $10/share and then hiring directly on the market.

TLDR: Nothing to see here - you can pay people in equity or in cash, and the choice is as difficult as it ever was.
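
RyanZAG's $10-versus-$5 point is just arithmetic; spelled out with hypothetical numbers (only the per-share prices come from his comment, the round size is invented), the discount is an implicit fee paid for the investors' help:

```python
# Selling equity below market price == paying the buyers the discount.
market_price = 10.0    # open-market value per share (RyanZAG's hypothetical)
sale_price = 5.0       # discounted price for the "helpful" investors
shares_sold = 100_000  # hypothetical round size, not from the thread

implicit_fee = (market_price - sale_price) * shares_sold
print(implicit_fee)  # 500000.0 - the effective cost of their help
```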

brudgers 5 hours ago 0 replies      
The title of this article is on pitch to me. PG's comment does not come across as a "calling out." It comes across as a general conclusion based upon his experience and data.

On the other hand Zacharias's blog describes observing and responding to feelings that there was a disconnect between the funding features available to his YC class and his entrepreneurial gut.

As a "pre-cofounder" (I love the term) he was in a position to take a different course. He also enjoyed the advantage of friendships with potential investors.

I can't help but see what he describes as extending some of the fundamental YC processes beyond the point where YC kicks the baby birds out of the nest. YC works because of the trust founders place in the partners. It works because founders don't worry about the investor screwing them over, and it works because founders can spend more energy building rather than negotiating terms.

This appears to be what Zacharias did. He was careful about who he sold his company to and conscientious about the price of shares which are likely to be worthless.

It was a personal strategy - right for Zacharias. PG is right that it is poor as a general strategy for YC companies.

amirmc 8 hours ago 0 replies      
"...and get the very best of the guilder-investing angels in their round."

I can understand the reasoning but are the 'best' of this pool as good as those from the dollar-pool? Wouldn't the really good/useful ones be trying to enter the dollar-pool anyway? (NB I'm not aware of the back-story yet)

Edit: Just read both pieces and I agree that they're both right. I see this as a difference between Angel vs VC. Angels sometimes get involved because they can imagine having a useful impact on their portfolio. If an Angel takes a tiny slice of a company which also has VCs and 'YC valuations' then they may feel they have no real 'clout' or ownership in the company (not everyone wants a board seat). Even though the economic argument may be to take-whatever-you-can-get that doesn't make it fun or worth your time.

npguy 5 hours ago 0 replies      
That is true not just for angels, but for VCs as well - Elon Musk feels strongly about that - "Always Go With A High-Quality VC Even If The Valuation Is Low"


jstreebin 4 hours ago 0 replies      
I'm not sure how the leaps from "price-sensitive" to "smart money" and "price-insensitive" to "dumb money" were made. I think it's much more accurate to divide investors into "price-sensitive" and "value-sensitive". Yes, there are still classifications such as "dumb money" and "smart money", but they're often misused when discussing topics like this. And I've yet to see any posts on early-stage valuations of the biggest companies of the past few years, which could explain how "smart" or "dumb" either is. I also haven't gone through the presos of funds of both types to see who's getting the best returns, but I can remember anecdotally that both classes of investors have had success. (The Kauffman Foundation would have some good data on this.)

I will say that one nice thing about pricing on the higher side is it requires more conviction on the part of your investors. Thus, it will be those that are committed to your company, despite a higher relative price. (All investors are price-sensitive, it's just to what degree.) So by pricing it higher you end up with the same result as CZ and DS are seeking, except you keep more of the company.

And as far as the price-sensitive, there are many reasons, aside from them just being "smart": a) they're trying to raise a fund and need lower valuations, "better" numbers, for potential LPs (since LPs are usually investors with a more traditional mindset on finance, and thus more price-sensitive), b) their existing investors want to see lower valuations, "better" numbers, c) they're able to get "good deals" (I'm not mocking this, I'm just noting it's a value judgement, not something concrete) at lower valuations, d) they care more about potential multiples on their own fund(s) more than investing in the outright best companies, and any number of other reasons.

Hunting for the cheapest relative startup isn't necessarily "smart" (nor is investing in say uncapped notes), and investing in "expensive" startups isn't necessarily dumb (nor is haggling over price).

mehdim 5 hours ago 0 replies      
What you talk about is a kind of "blue ocean strategy" applied to investors.

Red investor oceans represent all the industries in existence today - the known investor space.
In the red oceans, investor boundaries are defined and accepted, and the competitive rules of the game are known.
Here, start-ups try to outperform their rivals to grab a greater share of investment demand.
As the funding market space gets crowded, prospects for high valuations and rapid investments are reduced.
Products become commodities or niches, and cutthroat competition turns the ocean bloody; hence the term red oceans.

Blue investor oceans, in contrast, denote all the industries not in existence today - the unknown investor market space, untainted by start-up competition.
In blue oceans, investment demand is created rather than fought over.
There is ample opportunity for high valuations and rapid investment.
In blue oceans, start-up competition for funding is irrelevant because the rules of the game are waiting to be set.
The blue investment ocean is an analogy for the wider, deeper potential of investor space that is not yet explored.

(Adaptation of the Wikipedia article on blue ocean strategy to start-up investment)

In your article the "guilder-investing angels" are the blue ocean for investment in start-ups.

sskates 7 hours ago 3 replies      
I'm now curious whether there is a segment of investors who use valuation to drive their decisions, even if it's irrational to do so (valuation in Chris' case was a 2x factor, whereas success vs. non-success is a 100x one). If they exist, it may be worth targeting them.
bartwe 8 hours ago 2 replies      
Guilders have been out of circulation since the euro was introduced.
gtz56 4 hours ago 0 replies      
I think the title is backwards, it should be, "Chris Zacharias is right, but Paul Graham is righter".
GNU sed 4.2.2 released, maintainer resigns gmane.org
367 points by bonzini  1 day ago   103 comments top 14
dasht 20 hours ago 4 replies      
For a time I too was the maintainer for GNU sed. Part of that time I was paid by the FSF for that work (this was a long time ago, when the FSF had a small staff working directly on the Hurd and GNU). When I started, GNU sed was very incompatible with the relatively new Posix standard for sed. When I finished, it was less incompatible. It was during this same time that I started work on a new regexp engine for sed but that was not done by the time I stopped work on sed.

From that position I was able to observe fairly closely how the GNU project was being led, technically, in the days before the Linux kernel had had any real impact.

RMS's technical leadership was, I think, not very skilled. Let me explain what I mean:

If you were working on a program and sought his advice, he was very good at zeroing in on the issues and giving excellent advice. And sometimes if you were working on a program and he noticed something he didn't like about your approach, his criticisms were very good. People used to tell stories about how good a programmer he was and those stories were basically all true. He was sharp and I assume that, in spite of his age, he still is.

The problem was that he showed no effective capacity to really lead the larger meta project of pulling together a complete OS. He tried -- with projects like autoconf and documents like the GNU coding standards. And he kept a list of programs that, once we had those (he reckoned) along with a kernel -- GNU would be "done". That was about the extent of his "big picture" for project management.

Mainly, he concentrated on advocating for the idea of software freedom. I think the gambit was that if enough people demand their freedom, the project of organizing a GNU project would become easier. I don't think this gambit worked.

That was never a clear enough, coherent enough, or informed enough vision of the complete GNU project and, consequently, GNU has never really successfully gelled. You can grab some "100% libre" distributions, these days, but only barely. There is no sustainable culture and technical organization there ("yet", I hope).

The RMS failure I see is a failure at being a community organizer of GNU programmers. A lot of people got the vague idea of a GNU project. Many of us were happily recruited to the goal. But everyone I worked with at the FSF, including me, kind of went off in various incoherent directions -- doing what we guessed would help and that seemed interesting to us. We never "pulled together as a team" and, in the GNU project, that still doesn't happen.

The GNU project gradually accumulated a heck of a lot of very good "parts" but could never gel. The first three world-changing releases (GDB, GCC, and Emacs) really startled people. The various shell/text utilities in those early days spread because they were often usefully a little bit better than the proprietary "native" equivalents shipped by Sun, Dec, AT&T, etc. People sat up and took notice but behind the scenes the project of setting up a lasting "complete OS" project that would promote software freedom for all users ... never quite came together.

The "open source" people -- whom I also later worked for, because I made a mistake in trusting them at their personal word to me -- seemed at first like they might help bring resources to the problem. In fact, what they mostly concentrated on was creating proprietary products using the free software "parts" from the incomplete GNU project. In the early days they sought to monopolize some of the key labor for the GNU project (and they succeeded, because they paid much better than RMS and many of those particular hackers didn't really give a shit about the freedom of users). As the "open source" industry matured it perfected its model of a perpetually incomplete / inadequate free software OS as a source of inspiration to enthusiastic youngsters, realized in practice as a perpetually freedom-denying set of proprietary OS products. Companies like Red Hat and Canonical realized that they could exploit the deficit of community organizing to charge high rents for libre software, so long as they don't care seriously about the freedom of users. That's what they did and what they do.

So in my view, RMS was not good (and still is not good) at leading the GNU project -- but the real tragedy is brought on by the glad-handing, deep-pocketed, "open source" rentiers who place concern for their own profit above the freedom of the community.

belorn 23 hours ago 1 reply      
I find the "rant" and the linked post by Nikos Mavrogiannopoulos sadly lacking any direct details about what the actual issues are. From reading it, one could get the idea that all the problems stem from the GNU coding standards being archaic and not updated for modern programming.

Okay, I am skipping all the rant about the FSF not funding software projects to pay developers, or that the GNU brand is not "hot", but both feel a bit silly. The hotness of a brand is transient, and in reality only a handful of brands inspire users and developers. I can't see how GNU would be more or less hot than, say, Gnome, KDE, or Apache, each of which has a large number of projects under it. As for funding, since when did any of those organizations actually fund the projects? Their role is to help set up funding systems, help with tax declarations, and provide further legal help.

Thankfully, the last link at the end (http://lwn.net/SubscriberLink/529522/854aed3fb6398b79/) looks to shed some light on what the actual issues really are: copyright assignments being US-only, who the "owner" of a community project is, Nikos' feeling that he isn't getting any tangible benefits from being under the name GNU, and lastly a request for more transparency in the GNU project's decision process.

As for those reasons, there are two I agree with and two I don't. First, copyright assignments being US-only is bad and shows an inflexibility a non-profit foundation should not have. Their role is to help projects, so they should be as flexible as possible and provide an equal possibility to assign copyright in the US or the EU. Second, as for who the owner of a community project is, the answer should stare the developers in the face: it should always be the community (developers and users) that "owns" the project and decides its fate. If Nikos' announcement had included a decision by the community (preferably reached in a transparent manner), it would have been hard for GNU to object. Third, Nikos' feeling that he isn't getting any tangible benefits from GNU is his to have, but legal assistance is something many projects value. If a project has no need for legal assistance, no need for help in creating donation systems, and doesn't feel threatened by lawsuits against individual developers, then a foundation such as GNU, Apache, or another similar organization is not going to give much tangible benefit. Fourth, in regard to more transparency in the GNU project's decision process, I can only agree with Nikos. The cornerstone of a community is transparency, and GNU should be fully aware of this. If discontent is growing because of a lack of transparency, it should be addressed and fixed with high priority.

chernevik 19 hours ago 1 reply      
All disagreements aside, I'm deeply grateful for the efforts made to maintain and extend tools like sed.

I can't help seeing posts like this and worrying about the perpetuation of open source, and I wish I had the chops to do more to help.

As I write I'm downloading a Raspberry Pi image for my son's hardware. I'm getting him an Arduino, a soldering iron and a book for Christmas. I'm looking forward to learning along with him. I don't claim to understand the particular flows of code or inspiration, but I don't see how those projects happen without open source.

I also don't see how the Pi happens without industrial-scale chip production. As I understand the matter, the Pi was developed by Broadcom staff on their own or 20% time, and its production occurs in interstitial time on production lines that could never be justified by a $25 SoC. The Pi is basically a cheap add-on to a massive industrial base.

Of course one point of vision is describing a realizable potential not apparent to the rest of us. But vision can and does proceed despite deviations from its perfect realization -- and sometimes is corrected by those deviations. I deeply disagree with RMS' politics, I'm deeply grateful for his technical contributions. I hope the community can always find a way forward.

juddlyon 23 hours ago 1 reply      
I wouldn't characterize that as a rant, Rails is a Ghetto was a rant. This was more of a reasoned venting. Too bad though, the guy seems like he's poured a lot of himself into these projects (from an outsider looking in).
MatthewPhillips 1 day ago 5 replies      
Why is Stallman still BDFL if he hasn't contributed a meaningful amount of code in years? Let him be the spokesperson so he gets the attention he desperately needs and leave the coding standards to people who code.
dfc 1 day ago 1 reply      
I have been an FSF supporter for a very long time. That being said, I have never understood why the gnu-prog-discuss mailing list is so secretive. I can understand having restrictions on posting, but I have never heard a good argument for keeping the discussions behind closed doors. I do not think SPI has any cabalistic mailing lists.
pestaa 1 day ago 1 reply      
I made a few vague observations based on comments scattered around the web and this rant just made me want to write them down.

* GNU leadership seemed very stubborn from the beginning.

* GNU software is really great.

* Gnome is the new GNU.

I wish they wouldn't lose more momentum or the wide variety of software they write and maintain will suffer, too.

cmccabe 20 hours ago 0 replies      
I wonder if his copyright assignment agreement also covered the assignment of trademarks. The name of the project, which seems to be the thing under dispute, would certainly fall under that umbrella rather than copyright law. On the other hand, GNU might have a pretty strong case that including the word "GNU" in the name without being actually affiliated with them would be misleading.

I have to say, cases like this really point out the flaws in copyright assignment. It just doesn't make sense from a developer's perspective. If you put in the work to create the code, why would you allow someone else to control the licensing and the name? With proprietary software, the reason is clear-- in exchange for money. But with open source or free software, you really have nothing to gain from copyright assignment, and a lot to lose.

If you disagree with whatever the GPLv4 ends up being (or v5, or v6...), your only option is to fork the codebase and choose a new name. Experience has shown that renaming the project loses most of the userbase (think OpenOffice vs. LibreOffice.) This just isn't right. Developers should have a say in how their code is used-- they should be consulted when the code is going to be relicensed.

marcoamorales 1 day ago 2 replies      
As someone who shares the FSF's ideals, reading this rant makes me think that maybe there's a chance to bring up another movement with roughly the same ideals as Free Software but a different type of leadership.
nnq 12 hours ago 1 reply      
> It is likely not possible to convince a diverse group such as the group of GNU maintainers to agree on coding standards for C++

...bluntly asking: why? (In any closed-source C++ project, if someone writes a "style guide and coding standard" thing and the project manager supports it, people start writing "compliant" code, grunting or moaning at first but they do, and then it becomes part of "company culture" and people find it natural to write code by it -- I believe it was like this with Google's C++ too... why does it have to be harder for an open source project?)

vsbuffalo 1 day ago 2 replies      
I really like the elephant and gazelle argument. I am a huge emacs proponent and I love using it, but I feel like it needs to be forked and gutted. The whole beauty of an extensible editor is that extensions should be optional. Including more shit each release is not justifiable.
peripetylabs 1 day ago 2 replies      
Copyright assignment is impractical, and a great way to eliminate outside contributions to a project.
piqufoh 1 day ago 1 reply      
If GNU BDFL is not supported by the community, and he wields his power unwisely, then maybe it is time for a fork - GNOME?
ohnoohno 16 hours ago 1 reply      
My view: sed does not need to be "extended", nor should it require much maintenance. At least, the BSD seds I use have not needed much work. I recall Brian Kernighan mentioning how little maintenance awk has required over the years. As such, I fail to see why changing maintainers is newsworthy. Perhaps someone was looking for an excuse to state their opinions on other matters?

I'll be honest, I could not understand what Mr Bonzini is trying to say any more than I could understand Mr Stallman's antics in the recent YouTube clip. With all due respect, what are these people on about? What is the problem? Clearly and succinctly, please.

Investors chriszacharias.com
123 points by niyazpk  16 hours ago   35 comments top 10
pg 14 hours ago 10 replies      
"They had been completely priced out!"

This is a fallacy. People use this term "priced out" as if it meant some sort of process, but it means nothing more than that the investor thought the startup's stock was too expensive. And it is very stupid to let valuation decide which startups you invest in, because the variation in outcomes between startups is orders of magnitude greater than the variation in valuations. I.e. there is no value investing in startups.

What we have here is a case of anecdotal evidence. A founder happened to get some investors who hadn't invested in other startups because they felt the valuations were too high, and those investors turned out to be really helpful. But there are other investors who are willing to invest at high valuations who are helpful, and investors who seek out low valuations who aren't.

scottkduncan 14 hours ago 2 replies      
I think this fits into the larger conversation around what terms to take in a funding round. The anecdote about all the YC companies bragging about valuation post-Demo Day rings true - but what else did they agree to in getting that high valuation? Ensuring that those who have partial control of your company and your future have interests aligned with yours does seem much more valuable than wringing every dollar out of a cap rate.
cududa 12 hours ago 0 replies      
This is my first company, but I've found that by having a lower valuation cap on my note two great things have happened. Some of my more powerful investors have felt comfortable introducing me to people who don't give two shits about the latest consumer web fad or investment trends, found my terms very easy to buy in on, and have been enormously helpful. Second, a lot of my friends have found they couldn't recapitalize or raise an A. Having the terms that I do, it was pretty easy for my investors to re-up when I needed it.
tomasien 12 hours ago 0 replies      
My company wasn't able to raise much money, but the money we did raise was from people who, luckily, have turned out to be amazing. We've gone through some really terrible things (my submissions will point you to some details), but they've stood by us 100%. Even though we had to shut down 3 months ago, these guys are still trying to make connections for us that would allow us to restart. You can't put a price on that, especially with the importance of not dying.

Everything is going to go wrong: optimize for having people around you that are going to help you out of THOSE times, because when it's going well help will chase you down. When it's going bad, you better have backup.

arbuge 5 hours ago 0 replies      
"Think about it. With too high of a cap or valuation, what incentive does an investor have to go to work on your behalf in the short term when the real return on their investment requires several orders of magnitude of growth, which has a very low likelihood of happening ever?"

It seems to me that if the angel investors in question are really able and willing to do productive work, you could get a similar result by simply paying them additional equity as an incentive after allowing the market to set the company valuation in the natural way. In other words, same as you would for any other early stage employee. No need to artificially interfere with the valuation and set it low to attract them.

dainix 10 hours ago 0 replies      
Not sure if I believe in bootstrapping because of my background and location (Latvia), but it seems crazy that people try to raise millions of dollars, as if their lives depended on it!! How could you possibly need money like that to launch a successful project???

I understand that in the USA all the hires are much more expensive, but why not aim for bootstrapping if you have some money, hire cheaper Filipinos to help out at the start, and then go all shiny and hustle!!

diziet 15 hours ago 0 replies      
I like the concept of the investors having some skin in the game. One way might be to have them invest a lot -- and have a lot to lose, in addition to having a lot to gain from a smaller valuation. I also wonder if a smaller valuation makes a company more hungry and more driven to innovate and work harder than a company that knows they can throw money at a problem and 'solve it'.
eps 12 hours ago 1 reply      
Chris, if you are reading this, how did you find your investors? Say, that specific person you met on a NY layover.
CurtMonash 5 hours ago 0 replies      
"No price is too high" is always an incorrect statement, at least in the context of investing.
infoseckid 14 hours ago 1 reply      
Another post on celebrating "I got some moneys" and yay! this time without YC, even though I was in YC.

Guys, get over it. Can some of you please post some inspirational articles on how you created a bootstrapped company?

How to Give a Great Presentation: Timeless Advice from a Legendary Adman, 1981 brainpickings.org
14 points by Brajeshwar  4 hours ago   discuss
3D-print records for your turntable instructables.com
30 points by alternize  7 hours ago   2 comments top 2
lostlogin 2 hours ago 0 replies      
I wonder if it would have higher audio resolution if phonograph cylinders were used as the recording medium. http://en.wikipedia.org/wiki/Phonograph_cylinder
dexter313 6 hours ago 0 replies      
Previous discussion:

It sounds pretty good, similar to an old tape recording.

Lightbank Aims To Change VC Game, Will Invest Beyond Chicago techcrunch.com
4 points by iProject  1 hour ago   discuss
Fear Not Deflation forbes.com
3 points by secondChrome  1 hour ago   discuss
New journal of computational linguistics appears, encourages CC-licensed content ipipan.waw.pl
24 points by nathell  8 hours ago   2 comments top
PeterisP 6 hours ago 1 reply      
The indexing issue isn't described, and it's quite important. For example, my research needs to be published somewhere that's indexed in Thomson Reuters Web of Science or SCOPUS [even though some good journals of my subfield aren't there] - otherwise the results don't count in most practical measurements that determine the evaluation and funding of me and my institute.

My colleagues and I can't publish in nonindexed (or weakly indexed) public journals, since you won't be able to publish the same research results in a 'good' journal or conference later (it's no longer 'original, unpublished research') - in essence, publishing here would mean throwing away many months of work, since the work itself and its citations will be disregarded.
