Hacker News with inline top comments, 26 Sep 2011
Netflix will stream Dreamworks nytimes.com
45 points by allending  1 hour ago   8 comments top 5
kloncks 20 minutes ago 1 reply      
Netflix will begin streaming DreamWorks films starting in 2013.

Lovely. See you in two years.

swanson 36 minutes ago 0 replies      
Got excited for new content on Netflix Instant -- then read further and saw it wasn't taking effect until 2013...I am now less excited.
icarus_drowning 37 minutes ago 0 replies      
Well, it certainly is nice to see that Netflix's promise of securing higher-quality content is at least partially coming to fruition.

Also, it is nice to see a content creator finally start talking seriously about internet streaming versus traditional cable as a means of content delivery.

antimora 32 minutes ago 0 replies      
I am still waiting for the day I can play Netflix on Linux.
mitjak 32 minutes ago 1 reply      
DreamWorks Animation, not DreamWorks. That's a much smaller subset of the DreamWorks catalogue. Link bait.
Four things I learned on a round-the-world yacht race tonyhaile.com
84 points by arctictony  4 hours ago   9 comments top 4
logjam 1 hour ago 0 replies      
Blowing his horn and chasing Cape Horn dreams, Harry [Mitchell] sails out of Charleston on September 17 1994. 'For the rest of your life don't waste any time. Make the best of what you may before you turn into clay,' he told students before he left Sydney for the Southern Ocean.

(Harry Mitchell, age 70, was lost at sea in the Southern Ocean in the 1994 BOC solo around the world yacht race).

-- Paul Gelder, The Loneliest Race

I began to understand the struggle and the despair in the simply written ships' journals, in the monochrome prose that could suddenly bloom with feeling:

March 29, 1913: Terrible heavy NW gale. Lost mizzen upper topsail and main lower top gallant sail. Got two men hurt. All hands on deck all night.

30th. 6 AM: quick shift from NW to SW with hurricane force, with terrible heavy cross sea. Ship under two lower topsails and under water. Lost outer jib. Washed off the boom.

31st: wind SW. Very heavy gale.

April 1st: terrible heavy WNW gale, ship under two lower topsails and drifting to the eastward, and my heart is broken under these heavy gales all the time.

So reads the log of the Edward Sewell, 263 days out from Philadelphia to Honolulu, battering to westward in the grip of The Horn for 67 days.

-- David and Dan Hayes, My Old Man and The Sea

mikedmiked 1 hour ago 1 reply      
Once in a while I am blessed enough to come across a post like this. It made me think a lot about my life and who I wish to become. I've been working a "normal job" and just forgot that it is possible to do something like this.
sliverstorm 1 hour ago 3 replies      
Can anyone comment on how doable it is for an ordinary Joe to get involved in something like this?
kposehn 2 hours ago 0 replies      
Fantastic post; it's very good to see someone share similar experiences. I haven't sailed around the world, but I've been through things that taught me the same lessons; I'm glad for it every day.
How Bad Boards Kill Companies: HP mondaynote.com
44 points by alexandros  3 hours ago   5 comments top 4
_delirium 1 hour ago 0 replies      
Boards are really not set up to do any sort of serious monitoring of a complex multinational corporation, especially if they're supposed to be doing it independently of the executives they immediately supervise (CEO/CTO/COO/etc.).

Nearly every major company's board is full of people who have more-than-full-time day jobs, plus sit on 3-5 other boards. There is really no way they can do a proper job in that situation; in practice, they devote one day a month to doing the bare minimum to fulfill the responsibilities of each board position they hold, if that.

For example, after a story about Cisco acting badly (http://news.ycombinator.com/item?id=2789540), I was motivated to look at who was on Cisco's board who might exercise some oversight. What I found didn't seem likely to be a group of people spending much time on the matter. One of the board members is the President of Stanford University, which I assume is a pretty busy job to begin with. Another is the now-former CEO of Yahoo mentioned in this piece. One more is the CEO of Mercer, the world's largest H.R. consulting firm. How much time do you think these people each take out of their day jobs to really understand how Cisco operates? I would wager very close to zero.

So this seems (to me) much more of a structural problem with how large companies are run, than an HP-specific problem. They may have gotten particularly unlucky roll of the dice, but this style of corporate governance isn't set up to favor good outcomes.

linuxhansl 2 hours ago 0 replies      
A friend of mine works in the publishing business. He equated the previous few CEOs of his (large) company to locusts.

They move through and cut costs by laying off people. That produces great profit increases for a quarter or two, after which the lack of qualified employees takes its toll.

By the time the problems become apparent, the CEO has collected a huge bonus and moved on, leaving the mess for somebody else to clean up.

ilamont 1 hour ago 0 replies      
The author is correct to blast the board, but the claim that Whitman "has zero background in tech" suggests that she comes to the HP job as an old-school industrialist. Starting in 1998, she spent 10 years growing eBay, which required an understanding of market dynamics and making some smart technology hires and acquisitions (including PayPal) along the way.
ShawnJG 2 hours ago 1 reply      
This is a perfect example of checks and balances gone wrong. While CEOs and boards may not see eye to eye on every decision, they sometimes forget that they are not enemies. They should be working together toward common, productive goals. Even though these are billion-dollar companies, they cannot afford to keep destroying themselves from within; sooner or later profits will dwindle and they will be no more. Mistakes are bound to happen, but according to this article, HP doesn't seem to learn from theirs. They are thoroughly living the definition of insanity as if it were a religion!

I hate to bring politics into this, but regardless of which side you're on, Congress and the president seem to be doing the same thing. Although they're supposed to watch each other, they've forgotten that they're on the same side and should be working together to get things done. Internal fighting and division can kill governments as well as companies.

Stanford's online Machine Learning class now open for enrollment ml-class.org
99 points by roger_lee  5 hours ago   17 comments top 8
amirmc 3 hours ago 0 replies      
Just as a reminder for folks, there's a spreadsheet of HN readers who are taking part in the classes at http://bit.ly/pLCRzg

There are 120+ HNers on there so you might find folks nearby you'd like to get in touch with.

Edit: Obviously, if you're doing any of the courses feel free to add your details too

vl 2 hours ago 2 replies      
I wonder whether taking both ai-class.org and ml-class.org will be too much of a load for someone working full-time, and whether they will cover a lot of the same topics. Does anyone have any insight?
younata 10 minutes ago 0 replies      

It'll be interesting to see how this and the ai-class will affect my performance in my other classes.

gnok 2 hours ago 1 reply      
Somewhat disappointed after realizing that the videos are all in Flash. So I can't watch this on my iPad. Does anyone know if these exact videos are also available on iTunes U?
Sargis 4 hours ago 2 replies      
How difficult would this be for someone with minimal linear algebra knowledge?
Tycho 3 hours ago 2 replies      
Am I likely to learn anything of immediate practical benefit on this course? I'm not a computer scientist, just someone who writes lots of scripts to work with Unix/Excel/RDBMS/XML for a finance company.

(not that this would discourage me from taking the course; I'm just curious whether it will add some new tools to my Swiss Army knife of programming knowledge for my day job)

rcavezza 3 hours ago 0 replies      
I have a business background and I taught myself to code after getting involved in startups. I'm in the middle of deciding whether it is worth it to go back to school and get a master's degree in computer science. I hope this course will help me make that decision.
mattdeboard 5 hours ago 0 replies      
Just registered for this and looking forward to getting rolling.
Manufacturing "A Minimal Pen" in China, failures and successes kickstarter.com
71 points by rizumu  5 hours ago   41 comments top 12
ams6110 2 hours ago 1 reply      
Wonder if they even tried to find a manufacturer in the USA. Small volume, and technically very simple. No language, time zone, import, or travel barriers.
megrimlock 3 hours ago 1 reply      
For anyone who found this tale interesting, "Poorly Made in China" is a well-written and eye-opening account of similar antics.


Much like the Kickstarter project mentions, this book covers repeatedly broken promises, misleading claims (like the laser etching that turned out to be CNC), continuous reassurances followed by convenient disappearances, completely mythical factory sites and machinery, supposedly mechanical processes that turn out to be skilled hand-labor, and the need for vigilant and cynical quality control. It makes a convincing argument that once you account for the cost of all these shenanigans, export manufacturing is nowhere near as good a deal as it seems.

zokier 2 hours ago 0 replies      
Reminds me of what Jeri Ellsworth told about making electronics in China[1]: "After getting to Hong Kong I opened one of the units to find that they had cost reduced my reference design without telling me. "

[1] http://www.eeweb.com/spotlight/interview-with-jeri-ellsworth

albahk 2 hours ago 1 reply      
I'm amused at the mindset of trying to do low-volume, and hence (for the factory) low-profit, manufacturing runs in China, then complaining when it turns out to be more difficult than uploading a file and waiting for a box of Apple-quality products to arrive at your front door.

Simple economics - if it was easy, everyone could do it.

I work on construction projects in China, so I have some idea of the difficulties regarding quality and expectations.

naner 4 hours ago 2 replies      
Pen Type-A is a stainless steel replacement for the Hi-Tec-C's cheap plastic housing. To us, the Hi-Tec-C cartridges deserve a more durable home.

This is kind of funny; I sort of did the same thing, though my solution was much cheaper. I write a ton with ink, and I had a hell of a time finding a pen that wrote smoothly, was gel, was retractable (pen tops are a PITA), didn't smear, gave consistent lines, and was available at office stores for an affordable price. Well, the Pentel EnerGel pens met these requirements, but the pen casing is flimsy, fat, and very cheap-feeling. So I just buy the refills and put them in a Sarasa SE pen casing. My only gripe is that I would prefer a finer point: the Pentels only go to 0.7mm and I'd like at least 0.5mm, but this is good enough. Every gel pen I've used at that size has problems (scratchy, suffers roller-ball blowouts, etc.), so maybe good gel roller balls at that size present engineering problems.

Oh, and I also tried that Mont Blanc "hack" where you modify the refill cartridge so it fits in a cheapo Pilot G2 casing. I'm not willing to regularly pay $10-$15 for 2 ink refills, I just wanted to see what the fuss was all about. The writing was mediocre at best. Most noticeable was the problems I had with skipping and inconsistency. The value of Mont Blanc pens certainly does not come from the ink cartridges.

hugh3 4 hours ago 1 reply      
I assume there was some context in previous posts in this series, but I have no idea what's going on here.

Why are these guys trying to build a pen? What's a "minimal" pen? What's wrong with a fifty-cent Bic?

fishtoaster 4 hours ago 1 reply      
Quick Context link: http://www.kickstarter.com/projects/cwandt/pen-type-a-a-mini...

Short version:
They want to take the venerable Hi-Tec-C pen and put it in a steel body with a steel ruler sleeve. It will retail for $99.

2muchcoffeeman 4 hours ago 0 replies      
"The factory owner tried to tell us that in China laser etching is synonymous with CNC milling.  We were not amused."


abcd_f 2 hours ago 0 replies      
codecaine 3 hours ago 2 replies      
Did they actually receive $281,989 in funding, or is that a bug on Kickstarter's side?
Wingman4l7 1 hour ago 0 replies      
This situation instantly reminded me of this recent Wired piece: http://www.wired.com/magazine/2011/02/ff_madeinamerica/all/1 Made in America: Small Businesses Buck the Offshoring Trend
rizumu 2 hours ago 0 replies      
Another highly regarded pen is the Pentel Stylo MLJ20: http://www.cultpens.com/acatalog/Pentel_Tradio_Refills.html

The images from an artist who works with it: http://www.residentadvisor.net/feature.aspx?1332

How to define custom, colored labels (like TODO) in VIM github.com
13 points by pabloIMO  1 hour ago   discuss
Notes From the MIT Startup Bootcamp 2011 jayunit.net
27 points by ubuwaits  3 hours ago   discuss
Why All Employees Should Be VIPs At Your Company holler.com
111 points by biznickman  7 hours ago   29 comments top 11
rmason 5 hours ago 1 reply      
It appears to me from afar that Facebook has forgotten one of the most important rules of the Valley: when you start ignoring the contributions of developers, the better ones leave.


butterfi 5 hours ago 2 replies      
How much of this kind of thing is a by-product of marketing as well? I've seen more than a few events where the people who actually did the development were excluded to accommodate business partners and clients. It's not that marketing necessarily wants to exclude developers, but space and resources are limited, and socializing is an important tool in the marketer's tool kit. Not that I condone this (having been on both sides of the equation), but it does speak to the shifting sands of prioritizing the goals of these kinds of events: is it a company celebration to thank employees, or a marketing event?
_delirium 6 hours ago 0 replies      
There are quite a few hacker events with more of a flat-hierarchy, open-to-all-comers ethos, but yeah, you probably won't find them attached to these more high-profile media events (there are exclusive sponsored after-parties associated with events like WWDC and GDC as well, to add more examples). There's fortunately the whole other parallel world of stuff like Noisebridge, DevHouse, etc...
rektide 5 hours ago 1 reply      
Small non-exclusive company rails against big companies who shield their developers / have become exclusive, news at 7.

I'm all for companies that can keep developers and engineers up at the front, deeply technically involved and interacting with their communities. But that's not going to be all companies, and Facebook in particular is one of those companies that is by its nature a closed system, one that builds itself not out in public (a la, say, Mozilla), but through the proxy of press releases and new features dropped onto users' and third parties' laps.

The other big factor I'd pitch would be consistency: how much technical content is there, and what amount is appropriate for a developer to be showing off? If the event is not inherently a technical event, a fifteen- or thirty-minute segment from a developer might seem really out of place. Developers are to be respected, but Facebook isn't a technical company; it's a networking company.

gavanwoolery 4 hours ago 1 reply      
I have never had any desire to go to any conference, ever. What I tend to find is that talking gets very little done, and often not much is learned beyond a few truisms. If I want to talk to people I do it online, it is just more time-effective. At a conference, I don't necessarily know who a given person is, but online I have full access to their information (typically), and I only talk to the people I want to.

If I want to get into a good party (which is seldom), I go to whatever club is the current flavor of the month and shove a filthy wad of cash in the bouncer's hand. I could give two shits about an "exclusive" SV party, and I would hope most Facebook employees feel the same way.

sliverstorm 5 hours ago 1 reply      
"Why All Employees Should Be VIPs At Your Company"

Been visiting Lake Wobegon recently I see.

reso 5 hours ago 1 reply      
This guy has obviously never worked for Facebook, and so has no idea how Facebook treats its engineers. Not getting invited to a particular f8 afterparty (which sounds like it was Spotify's deal and not Facebook's), is peanuts compared to the level of respect and power they are given within the company every day.
fleitz 6 hours ago 1 reply      
The reality of the situation is that parties of 10, 100, 1,000, 10,000, and 100,000 people feel completely different. With social dynamics being what they are, it's difficult to invite everyone and still retain the atmosphere the planner sought. I have no insight into the planners' mindsets, but I have a feeling the situation is probably the result of the realities of party planning rather than a desire to exclude certain individuals, or a class of individuals.

The situation with the B-list party could probably have been handled better with regard to musical talent and an obvious discrepancy between the talent invited.

dorkitude 6 hours ago 1 reply      
Facebook hardly represents all of Bay Area tech.

And it certainly doesn't represent startups (at all).

codecaine 6 hours ago 2 replies      
I've never been to Silicon Valley; in fact, I've never been to the US. But I personally feel that this picture the author draws of the "old" SV is very idealized. The rant contains high amounts of nostalgia and therefore does not seem entirely rational. I would love to know if SV really used to be like O'Neill describes it in his article.
thatsiebguy 5 hours ago 0 replies      
I skimmed the headline at first and thought it said "How The "IT Crowd" Hijacked Silicon Valley". Boy was I in for a disappointment.
LaTeX tricks to make madman-Cthulhu-worshipper-like text stackexchange.com
72 points by Swizec  6 hours ago   7 comments top 3
sneak 1 hour ago 1 reply      
My girlfriend wrote a paper this year about constructed languages and the people who create and use them, and typeset the title in elvish. Did you know that there are not one or two elvish typefaces for LaTeX, but in fact seven or eight?

Despite my abiding love of capitalism and enterprise, it brought me great joy to know that there are hackers out there still slaving away on the entirely esoteric, and that not all of us have yet caught the "everything should be about revenue" bug.

mhd 1 hour ago 0 replies      
I'm not so happy with the result. The parchment is too three-dimensional, which clashes with the font. I'd recommend regular (or calligraphy) paper (maybe printed with a non-white background, if your printer is good enough), and then "aging" it. Tea or coffee is usually a good method, although simple crumpling might suffice.

Also, either for a medieval source or a 1920s document, block lettering seems a bit off.

But this really shows what's possible if you have programmatic control over your typesetting. I really should do some (La)TeX again…

tikhonj 2 hours ago 1 reply      
One thing that seems to be missing from the answers is any sort of (pseudo)random modifications to the resulting text.

Can this sort of thing be done with plain LaTeX? I've seen trivial examples of randomness using LuaTeX; would that be a practical way to add some randomness to the typeset text?
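One answer, for what it's worth: randomness doesn't strictly require LuaTeX, since pdfTeX already ships pseudo-random primitives. A rough sketch along those lines (assuming a pdfTeX-based engine; the macro names here are made up, and newer LuaTeX versions rename the primitive to \uniformdeviate):

```latex
% Jitter a glyph's baseline by a pseudo-random amount.
% \pdfuniformdeviate <n> expands to an integer in [0, n-1] (pdfTeX primitive).
\pdfsetrandomseed 1337  % fix the seed so the document compiles reproducibly
\newcommand{\jitter}[1]{%
  \raisebox{\dimexpr\pdfuniformdeviate 5 pt - 2pt\relax}{#1}%
}
% Usage (applied letter by letter):
% \jitter{C}\jitter{t}\jitter{h}\jitter{u}\jitter{l}\jitter{h}\jitter{u}
```

Applying this to arbitrary running text would still take a token-by-token macro loop or a Lua callback, so LuaTeX remains the more practical route for whole paragraphs.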

"When I design software, I really try hard to make it simple." mail-archive.com
114 points by dchest  9 hours ago   39 comments top 12
ot 6 hours ago 2 replies      
People really mean different things when they say "simple".

For some it may be "the abstraction is so clean, small, and self-contained that I don't have to look at the implementation"; for others, "the implementation is so clean, small, and self-contained that I don't need any abstraction". The author of this post clearly subscribes to the latter philosophy.

I guess many flamewars on mailing lists ultimately boil down to this culture clash.

arethuza 6 hours ago 0 replies      
One of my favourite quotes from C.A.R. Hoare seems relevant:

"There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature."

_delirium 7 hours ago 3 replies      
It's kind of interesting that his approach, using essentially 1980s-era scaling technology (inetd forking a server per request, plus old-style CGI for dynamic content), has no trouble at all running these high-traffic websites on a tiny VPS.

edit: Though, come to think of it, it shouldn't be that surprising. If that architecture managed to work at all on '80s hardware, it should scream on anything from 2011, even a VPS.

acqq 5 hours ago 0 replies      
In case somebody doesn't know: D. Richard Hipp not only wrote SQLite (the SQL engine used in the iPhone, the Chrome browser, and who knows where else), he also put it in the public domain.

Moreover, SQLite adds only around 260 KB to the executable. That was one of the design goals!

chubot 5 hours ago 2 replies      
This is awesome -- a wise man.

However, he is only able to do this because he writes in C. Python and Ruby are not slow, but they have terrible startup time, which causes a huge amount of overhead for CGI: they probably do 1,000,000x the work of a C program before even reaching main(). I bet he statically links his binaries (or at least has few dependencies on shared objects), because that has a pretty big cost too.

I wonder if writing the CGIs in Lua would have the same efficiency. In Lua you would pretty much have to fork, because it doesn't have threads and the interpreter is not thread-safe (there is no global interpreter lock). Or maybe V8 would work too.
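The startup-overhead claim is easy to probe empirically. A minimal sketch (Python stdlib only) that times how long it takes to fork/exec a do-nothing interpreter, which is roughly the fixed cost a CGI setup pays per request:

```python
import subprocess
import sys
import time

def startup_cost(argv, runs=5):
    """Average wall-clock seconds to launch a process that does no real work."""
    start = time.perf_counter()
    for _ in range(runs):
        subprocess.run(argv, check=True)
    return (time.perf_counter() - start) / runs

if __name__ == "__main__":
    # Per-request fixed cost of a CGI written in this interpreter:
    t = startup_cost([sys.executable, "-c", "pass"])
    print(f"interpreter startup: {t * 1000:.1f} ms per process")
```

On a typical machine a full interpreter launch lands in the tens of milliseconds, versus far less for a small statically linked C binary, though the exact numbers depend entirely on the system.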

wrs 4 hours ago 1 reply      
I love small and simple, I love self-contained. But then I got to GetMimeType, where he wrote his own binary search code instead of calling the C library to do it.

Though it sounds simple, binary search is a minefield of edge cases, and I've seen textbooks that get it wrong.

So when I read the GetMimeType function, I didn't think "how nice and simple"...I thought "Hmm, I bet that doesn't work."
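The worry is well-founded: the textbook midpoint computation (lo + hi) / 2 overflows signed arithmetic in C once the array is large enough, and the loop bounds invite off-by-one errors. A sketch of the idea behind an extension-to-MIME-type lookup (the table here is a made-up subset for illustration, not Hipp's actual GetMimeType data):

```python
# Lookup table sorted by extension; a hypothetical subset, not althttpd's.
MIME_TYPES = [
    ("css", "text/css"),
    ("gif", "image/gif"),
    ("html", "text/html"),
    ("jpg", "image/jpeg"),
    ("png", "image/png"),
]

def get_mime_type(ext, table=MIME_TYPES, default="application/octet-stream"):
    """Binary search over a table sorted by extension."""
    lo, hi = 0, len(table) - 1
    while lo <= hi:
        # In C this must be lo + (hi - lo) / 2 to avoid signed overflow;
        # Python integers don't overflow, but the safe form is kept for clarity.
        mid = lo + (hi - lo) // 2
        key = table[mid][0]
        if key == ext:
            return table[mid][1]
        if key < ext:
            lo = mid + 1
        else:
            hi = mid - 1
    return default
```

In C, libc's bsearch(3) would avoid hand-rolling this loop entirely, which is presumably the commenter's point.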

treo 7 hours ago 1 reply      
Just in case anybody cares: there are more or less simple ways to use CGI with nginx. See here: http://wiki.nginx.org/Fcgiwrap or http://wiki.nginx.org/SimpleCGI
rednaught 7 hours ago 1 reply      
For those who do use inetd: you might be interested in xinetd, which adds some access controls. I've been using it for years for very simple apps, and it "just works."


malux85 5 hours ago 0 replies      
My mentor once said to me "Don't show me how clever you are by writing complex software. Show me how clever you are by making complex software simple"

Those words echo in my head daily

pestaa 5 hours ago 0 replies      
This is the exact page I was trying to find for days! I thought Mr. Hipp posted this somewhere on sqlite.org... Thank you dchest so much!
tszming 7 hours ago 0 replies      
I was not aware that Fossil SCM and SQLite are from the same guy.
Luyt 5 hours ago 0 replies      
FLOSS Weekly interview with Richard Hipp:


"SQLite, an in-process library that implements a self-contained, serverless, zero-configuration, transactional SQL database engine."

Enrollment for Stanford's online DB class now open db-class.org
111 points by roger_lee  9 hours ago   25 comments top 15
pbh 2 hours ago 1 reply      
To answer a few of the database questions that keep popping up in this thread:

1. Is it worth understanding relational algebra? Yes. Definitely. It's not that hard; there's more or less a one-to-one mapping from relational algebra to SQL. Relational algebra is both a useful mathematical tool and indispensable when trying to understand query optimization. Relational algebra is also necessary to be able to read pretty much any of the database literature, if that's one of your goals. (Just be thankful you don't have to learn Relational Calculus or Datalog.)

2. Is it worth learning XML features like XML DTDs, XPath, XML Schema, and XQuery? Yes, definitely. I don't like XML, but there are tons of places where you can use XPath. (The most interesting thing about XQuery is probably how similar it is to SQL, though.)

3. Is it worth learning about SQL triggers? Probably. Ultimately, the web development world is split between people for whom separation of concerns means doing more in the database, and those for whom separation of concerns means doing less in the database. If you think doing validation and transactions in the database is sensible, triggers are also a sensible thing to learn about. However, people like DHH disagree and want validation in your app server with a dumber backend.

4. Is this vocational training? Well, this is the first in a three-class sequence at Stanford. This class is mostly about schemas and query languages, so in some sense it's the most useful to practitioners (or at least, non-DBA practitioners). The second class is about systems and implementation (on-disk layout, indexing structures, query optimization). In the third class, students either learn about distributed databases or actually build their own from scratch. However, schema and query languages have enough theory behind them, and enough generality, that I wouldn't consider this any more vocational than a class on compilers.
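To make point 3 concrete, here is a minimal sketch of in-database validation with a trigger, using Python's bundled SQLite driver (the table, column, and trigger names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL);

    -- Validation in the database: reject emails without an '@',
    -- no matter which application issues the INSERT.
    CREATE TRIGGER validate_email BEFORE INSERT ON users
    BEGIN
        SELECT RAISE(ABORT, 'invalid email')
        WHERE NEW.email NOT LIKE '%@%';
    END;
""")

conn.execute("INSERT INTO users (email) VALUES ('a@example.com')")  # passes
try:
    conn.execute("INSERT INTO users (email) VALUES ('not-an-email')")
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # the trigger fired
```

The appeal of this style is that the rule holds for every client of the database; the DHH-style counterargument is that the logic is now hidden from the application codebase.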

skymt 8 hours ago 1 reply      
jules 3 hours ago 0 replies      
I watched the first six videos and I have to say it is a bit boring. It feels more like vocational education in using man-made contraptions than a university course. I was hoping for a course on how databases work: indexing structures (B-trees, COLAs), query optimizers, transactions, column vs. row storage, etc.
amirmc 3 hours ago 0 replies      
Just as a reminder for folks, there's a spreadsheet of HN readers who are taking part in the classes at http://bit.ly/pLCRzg

There are 120+ HNers on there so you might find folks nearby you'd like to get in touch with.

Edit: Obviously, if you're doing any of the courses feel free to add your details too

crag 3 hours ago 1 reply      
I love the course. But I'll admit that during the XML section I was fighting to stay awake.

And relational algebra? C'mon...

yequalsx 5 hours ago 0 replies      
I've tried searching for a list of Stanford's online open enrollment courses and haven't found anything. Is there a website that details their course offerings and a list of planned courses?
nphase 5 hours ago 0 replies      
Can I follow the advanced track at my own pace without being forced to participate? The advanced tracks of all of these classes seem to require participation, which I can't guarantee on their timeline right now... (yay, startup life!)
rmnoon 5 hours ago 1 reply      
I took this class from Prof. Widom at Stanford. It was...meh.

Are SQL triggers still relevant? How much relational algebra do you really need to know? I think I've used this stuff maybe once since then.

beaumartinez 7 hours ago 1 reply      
As the email I received when I enrolled put it: "The class will run from October 10 to December 12."
eldina 5 hours ago 1 reply      
Is it clear whether these open courses will be repeated in the future, or if they are mainly tests before Stanford starts charging?
It is wonderful that they are freely available, but can it really be a permanent thing, and what does Stanford gain?
lambada 3 hours ago 0 replies      
Any idea why the AI class website looks so different to the ML and DB classes? It doesn't even use the same TLD, let alone have the same style.
grinnbearit 8 hours ago 0 replies      
It looks like the ML class is up too http://ml-class.org
burrokeet 7 hours ago 0 replies      
cool class even though I failed my EE qual session with Prof Widom :)
Kaedon 8 hours ago 0 replies      
Awesome, thank you for posting this.
rosariom 8 hours ago 0 replies      
Thanks man this is an ultra cool subject!
ORNL invention unravels mystery of protein folding ornl.gov
19 points by massim  3 hours ago   8 comments top 4
troymc 59 minutes ago 0 replies      
mynegation 1 hour ago 1 reply      
Short URL to the patent at USPTO: http://1.usa.gov/plfTg3

I am not a specialist in protein folding, so take this with a grain of salt, but from the figures it does not look like the method predicts the actual configurations decoded from experiments. The patent itself acknowledges this (5.50): "the lack of computational prediction [...] can be attributed to problems arising from calculating molecular mechanics potentials (force-fields)".

Could anybody knowledgeable make an assessment how important this invention is?

robryan 2 hours ago 1 reply      
Pretty low on detail, such as how this differs from the old method and what exactly makes it so much more efficient.
smoyer 3 hours ago 2 replies      
Why is there a patent on an algorithm that was developed using taxpayer money? Or maybe more importantly, why is it "available for licensing"? Technically, I already own a small portion of the technique!
DevOps in Milliseconds appnexus.com
19 points by pedoh  3 hours ago   4 comments top 2
ams6110 48 minutes ago 0 replies      
If you're interested in this stuff, there is a "Camp DevOps" in Chicago next month.


This involves some of the same folks who put on the excellent Erlang Camp I attended last year; on the basis of that experience, I'd expect this to be a good value.

ww520 59 minutes ago 1 reply      
Speaking of deployment, does anyone have a good suggestion for a simple deployment system on AWS? Ideally, a generic EC2 instance is booted up with an apptype passed in, and based on the apptype it downloads the app code from S3 and starts the app. The code is pushed to S3 by dev when it's ready. Is there such a system?
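The flow described above is small enough to hand-roll. A hypothetical sketch (stdlib only; the S3 key scheme, local paths, and start script are made up, fetching assumes a public or presigned URL rather than real AWS credentials, and the user-data URL is EC2's standard instance-metadata endpoint):

```python
import subprocess
import tarfile
import urllib.request

# EC2's standard instance-metadata endpoint for user data.
METADATA_USER_DATA = "http://169.254.169.254/latest/user-data"

def app_key(apptype, version="current"):
    """Map an apptype to the S3 key the dev pushed (naming scheme is made up)."""
    return f"deploys/{apptype}/{version}.tar.gz"

def bootstrap(bucket_url):
    # 1. Ask the metadata service which app this instance should run.
    apptype = urllib.request.urlopen(METADATA_USER_DATA).read().decode().strip()
    # 2. Fetch the code bundle (public or presigned S3 URL assumed).
    url = f"{bucket_url}/{app_key(apptype)}"
    urllib.request.urlretrieve(url, "/tmp/app.tar.gz")
    with tarfile.open("/tmp/app.tar.gz") as tf:
        tf.extractall("/srv/app")
    # 3. Hand off to the app's own start script.
    subprocess.run(["/srv/app/start.sh"], check=True)
```

Baking this into an AMI as the boot script gives one generic image that specializes itself from its user data, which is essentially what the commenter is asking for.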
Editorial: Facebook's new sharing is anything but ‘frictionless' thisismynext.com
26 points by acak  4 hours ago   7 comments top 6
acak 2 hours ago 0 replies      
The part of the essay that stood out for me is this:

The panopticon was a building design dreamed up by the philosopher Jeremy Bentham, and in its most basic form, it's a prison scheme which allows observers (i.e., prison guards) to have a constant view of the inmates if they so desire, without the inmates knowing for sure if they are being watched. The effect, of course, is the feeling that one is always being watched, resulting in altered (more "normal," acceptable) behavior. Bentham's idea was, he said, applicable to poor houses, hospitals, schools, and mad houses, though he ultimately devoted his time to designing for prisons. The express purpose of the panopticon is behavior modification, what Bentham described as "a new mode of obtaining power of mind over mind, in a quantity hitherto without example." No such prison was ever built to Bentham's specifications.

The author hit the nail on the head in drawing that comparison to Facebook.

I have long been unsure how using such a version of Facebook would affect us. The answer might lie in the psychological effects of living in a panopticon.



v21 2 hours ago 0 replies      
And at the bottom of the article is the call to "Like" it. And a button that has already told Facebook that I've read the article.

(Don't get me wrong, the web page has also told Google (via Plus and Google Analytics), Twitter, Chartbeat, and a bunch of ad platforms I was there. And a couple of CDN platforms, too, I guess. It's not just Facebook here.)

There are at least three points to be made here: There's an irritating design choice that doesn't let you easily specify a granular enough privacy setting (although this is weighed against the general pain of making choices). There's the fact that having a private or counterfeit or incomplete public online identity is getting ever harder these days. And there's the fact that large corporations are collecting ever more data on us, and this data is increasingly hard to evade.

r00fus 2 hours ago 1 reply      
I predict that Google+ won't beat Facebook in the classical sense, but will force Facebook to reveal how exposed all of your content on Facebook actually is.

As Facebook embraces the "frictionless sharing" of Timeline and this auto-share (esp. for links clicked from other Facebook posts), they will eventually cross a line where users become dramatically less willing to share (since it all happens on your behalf without your explicit command).

aphexairlines 2 hours ago 0 replies      
Rdio listing the tracks you listened to is maybe a bad example. Last.fm has been doing this for years without complaints.
ajg1977 3 hours ago 0 replies      
The way that Facebook termed this "Frictionless Sharing", along with the video game achievement image, was a marvelous piece of PR. Who doesn't like things to be easier, and who would object to their game achievements being automatically posted?

I predict it'll be a matter of weeks before people discover that they really do not want apps to transparently post all of their activities for the world (or even their 'friends') to see, and there's a public outcry.

itswindy 3 hours ago 0 replies      
Why so many negative FB stories on the front page? I guess since Google can't Panda-fy Facebook, copy their model and drive them out of business, or manually give them a penalty, they do this. Even monopolies have their limits, apparently.

Google employees: Why not go and add a few more dozen ads on pages, there's still more room. Make Larry happy!

Walk Away From Your Computer. Seriously. kcurtin.squarespace.com
85 points by kcurtin  8 hours ago   39 comments top 17
sophacles 6 hours ago 3 replies      
Tangential to the main point:

The OP just learned a valuable lesson in what computers actually do. The single-char oops is usually a programmer's first insight into what it really means when we say "computers do exactly what they are told, nothing more or less". This is one of those things that has frustrated all of us to no end. Similar mistakes abound, and they cause a lot of people to give up on programming altogether. Here's the real kicker: they don't stop with experience; we just get better at noticing them faster.

For example, about 2 years ago -- 5 years into a programming career and 12 years into programming in general -- I had this code give me trouble:

    // C Code:
    if (test_fails)
        handle_failure();   /* stand-in name for the original single statement */

So I decided to throw a quick printf in there (the situation was such that learning the debugging tools available to me would have taken a couple of weeks, so printf was my quick solution). I should note that I had been working heavily in Python for the preceding 3 months. My code now looked like this:

    // C Code:
    if (test_fails)
        log("info about test");
        handle_failure();   /* same stand-in -- note it is no longer inside the if */

Which caused my code to crash every time instead of sometimes. C coders will know that multi-statement conditionals need to be wrapped in { }, but single-statement ones don't. Further, Python if statements are indent-scoped. The recent Python work sort of made me blind to the problem in the code. After 3 days of WTFing, talking to the rubber duck, and so on, I called over a colleague and he pointed out the mistake in something like 40 seconds. A younger me would have been embarrassed, but this just happens in programming.

My point is: these things happen, and I'm glad to see a blog post about this without the demoralization that normally comes with it. I think it should be discussed more openly in tutorials and other newb-oriented posts.

A few tips for anyone who is experiencing this sort of thing and not sure what to do:

Get a rubber duck (or whatever) and start employing the rubber-duck debugging technique: http://en.wikipedia.org/wiki/Rubber_duck_debugging it really helps!

Be willing to include colleagues early and often. Experienced programmers understand and won't think less of you. Doubly so if you are new. (They will tease you a bit about it, but in that shared-experience bonding way, not in the "mock the outsider" way.)

Realize that as you are chasing the bug by getting down and dirty with other bits of the program, you aren't wasting time, you are most likely fixing other bugs that would have otherwise just manifested later anyway.

espeed 5 hours ago 0 replies      
He's right, the "walking away" trick does work wonders sometimes. It gets you to relax, take a step back, and see the big picture.

The other aspect to this is reliance on Google. This is the part that stood out to me...

  I start to panic a little bit...I turn to google hoping
for a quick fix and copy/paste one of my error messages
into the search box.

Unless Ruby/Rails error messages these days are completely anemic (I haven't worked with Rails in a while), the line number and error message should provide enough clues to debug a misspelling without having to go to Google.

Trust your analytic abilities -- they will keep you out of that "panic" state.

Instead of seeing an error and panicking, be like Stanley Moss in Wag the Dog -- "This is nothing" :) Eventually you'll get to a point like Paul Graham where debugging relaxes you:

  I like debugging: it's the one time that hacking is as
straightforward as people think it is. You have a totally
constrained problem, and all you have to do is solve it.
Your program is supposed to do x. Instead it does y. Where
does it go wrong? You know you're going to win in the end.
It's as relaxing as painting a wall.

Source: http://www.paulgraham.com/hp.html

Groxx 1 hour ago 0 replies      
Um, very true, but this would also be a sign that they need to read their stack traces / test outputs, and probably try a debugger.

Seriously. Debugger. For tests, it's as simple as `gem install rdebug` and putting `debugger` in the failing test(s). When it stops on that line, type `eval instance_variables`, and voilà -- your array contains "@microposts" and not "@micropost". Running it against a full Rails application, especially with, say, Passenger, is a bit trickier, but it's still worth the initial effort.

wccrawford 7 hours ago 1 reply      
When I first started programming, I wasn't able to do this. Walking away meant I kept thinking about it. Trying to go to bed was impossible. I -needed- to solve the problem before I could sleep.

Now, if I get frustrated, I'm much more likely to walk away and do other things and come back with a fresh mind later. It has almost always been helpful.

nphase 6 hours ago 0 replies      
I have lost track of the number of times my coworkers and I have pounded our heads on our desks for hours over a problem, only to find the solution in the middle of a bathroom break.

Walking away is critical to any creative process.

sneak 1 hour ago 0 replies      
Two things to add:

1. As one of the article commenters pointed out: "git diff". Seriously.

2. One of the best things I ever did was to buy a weight bench/barbell. When I do my twice-an-hour "walk away for five minutes" routine, I go and do 20 bicep curls or bench presses. I solve problems faster and also improve my health/strength. Win/win!

reinhardt 1 hour ago 0 replies      
<half-trolling>Seems like in this case the lesson should have been to use a statically typed language instead.</half-trolling>

(PS: dyed in the wool Python guy here)

parallel 1 hour ago 0 replies      
I think the key here is the low-quality work at the end of the day. I often knock off when I detect that I'm running at a very low productivity rate. Invariably, the next day, the overwhelming and seemingly massive task from the night before takes the well-rested brain just a few minutes.

There's also something about "self-control as a finite resource" that comes into play here.

FuzzyDunlop 7 hours ago 0 replies      
I'd say this is neither counterintuitive nor particularly abnormal.

Sometimes you can, for whatever reason, get so close to something - a project, a situation, a problem, a case, etc. - that you get tunnel vision and fail to appreciate the bigger picture. That's just a product of focus, determination, and ambition, I think.

Once you step out of that tunnel and focus on something else, you can release the frustration you had before and return to the problem with a clear mind and even a totally fresh outlook.

The problem doesn't matter as much as your own wellbeing, so there's no point in stressing yourself out over one when there are a million other things you could be doing. Such a laid-back approach may allow you to be more productive, hence in that case it would be intuitive.

aorshan 1 hour ago 0 replies      
This is absolutely true. One of the most important lessons I ever learned in English class was to walk away. Always give yourself a day between when you write something and when you edit it. You come back fresh and with a new perspective.
Aviwein77 5 hours ago 0 replies      
I completely agree with this; countless times it has been a variable with a slightly wrong name, or forgetting to upload via FileZilla, or one of many small mistakes that seem obvious afterwards but cause a lot of frustration.

One comment that I saw on the original thread was talking to people. I wanted to highlight this because it has helped me more times than anything else. Even explaining step by step what my program should do to my girlfriend, who has very little computer expertise in general and usually gives me a blank stare, can help a lot.

I remember finding an image on reddit a while back of a programmer laboring over his code for hours only to find a greater-than sign that was supposed to be a less-than; my thought was 'a mistake we will all learn again and again'.

dorkitude 7 hours ago 1 reply      
Absolutely valid.

When I'm doing something difficult, nothing is more important than giving my subconscious some time to work out a problem.

mnutt 6 hours ago 1 reply      
While solving problems on your own is a great way to learn things, another solution to this is to just talk to someone. They don't have to be an expert in the field, they just need to be there for you to confirm your base assumptions are sound. In fact, often the act of explaining the problem to them will force you to think through your problem differently, leading to the solution. Like Kent Beck said, "Once a problem is described in sufficient detail, its solution is obvious."
ocharles 5 hours ago 2 replies      
I, like the rest of the commenters here, certainly agree that a break can be all you need. However, I think the OP is missing a valuable lesson: not only should they focus on getting the code to work, they need to understand what was wrong and, importantly, how to spot the symptoms. In this case, it was a misnamed variable. It's great that that's been fixed, but it's very easy to go hurtling on into the next problem. A few minutes spent reading the error message again (and perhaps trying to cause it somewhere else on purpose) goes a long way toward building up a much more reactive way of working.
ch0wn 6 hours ago 0 replies      
This is such important advice, it can't be repeated often enough. I can't even count the times taking a break helped me solve in minutes a problem I had been trying to crack for hours.
hugh3 4 hours ago 1 reply      
Sometimes I feel compelled to take a one-year break from all computers.

But I know that if I did, I'd just feel compelled to blog about it.

TWSS 3 hours ago 0 replies      
As a UI designer, this is something I need to tattoo on my forearm. Sometimes I'll spend so long banging my head against a problem that my bike-racer colleague will take me aside and remind me of the principles of overtraining.

Of course, I ignore him 80% of the time because I can't stand to step away from a problem before it's solved, but he's always right.

How Quake (the videogame) changed my life forever. derelict-compendium.blogspot.com
129 points by aw3c2  11 hours ago   25 comments top 22
gmurphy 8 hours ago 0 replies      
Quake is also responsible for changing my life: after years of muddling around in BASIC and LOGO I started writing mods in QuakeC, which required me to write code, design a website, design and build models, and test and test and continually tweak and iterate on the experience. For the first time I was writing large code and doing serious graphics work, and I loved it deeply.

It led me towards building websites and writing about games, and out of the Quake community I met people who ran Unreal websites - they gave me a copy of their CMS, and I ended up learning PHP/MySQL from it and building and designing bigger and better websites.

I dropped out of my Mechatronics/CS degree to pursue programming for a living - which led to my first exposure to the then-revolting idea that programming and design were different disciplines. I spent years bouncing back and forth between the two, never quite fitting in, but learning a ridiculous amount along the way.

Now I'm at Google, technically employed as a software engineer but leading the design of a large product. I probably would have ended up somewhere in the software industry anyway, but I believe that I'm in this exact position, the best job I could possibly imagine, because of a chain of coincidences that were kicked off by Quake and its modding tools.

ambiate 8 hours ago 0 replies      
The quake community and game changed my life. Of course, when I started, it was mplayer. I had just traded my Playstation 1 for a 486DX, a huge mistake in everyone's eyes. I used my rich friend's grandmother's AOL login to get online. (This continued for 3-4 years, my mother still wonders what the "weird noise" was on the phone lines past midnight).

The broken physics and quirks of Team Fortress in QuakeWorld are what really caught my eye. I was hooked like a fiend. I was recruited by many guilds and known for my 9600BP lagging, teleporting, and fragging! Not to mention my 1MB Cirrus Logic integrated video card; it chugged along at ~12FPS in 320x200(?).

I got interested in manipulating Quake. Living in MS, there were no mentors for learning to program or script. I went in blind and came out with a few mods.

Years later, I ported team fortress with quakeworld physics into Enemy Territory, (Feb/Mar 2004?), but never found anyone interested in doing the sound or graphics.

Obviously, my original endeavor into quakeC led to a whole new world of coding and languages!

At this point, I had a few life changes. I made a handful of lifetime friends from IRC and my old clans.

Now, looking back, it was Quake and my natural ability to tinker that led to my pursuit of a degree in computer science/bioinformatics. I am currently in my junior year.

I emailed John Carmack a few times asking for legal advice regarding using shareware Quake 1 models in a development version of my port of QuakeWorld to ET. He gave me good advice and has been a great influence.

Oh, and trying to figure out how to make VIS run faster on a BSP map was the end of me. VIS took forever, and I mean forever, to run on my 100MHz computer.

mambodog 5 minutes ago 0 replies      


Oh how I hated you.

harryh 5 hours ago 0 replies      
I wouldn't say it changed my life, but the first pretty serious piece of software I ever wrote was a game loader for Doom/Doom II/Heretic/Hexen. You could select which game you wanted to play and which WADs you wanted, and if the WADs were originally created for a different game, it would run them through a conversion script for the game you wanted.

Later on I added support for DeHackEd so you could modify the exe to change things like weapon speed and power. Pretty sure I had support for setting up multiplayer games as well.

It was all written in Turbo Pascal and had a really nice GUI where I programmed all the primitives (radio boxes/check boxes/scroll boxes/buttons/etc) myself from scratch.

I really really wish I still had source code to the thing, but I lost it years ago. I was really proud of it.

angrycoder 8 hours ago 0 replies      
Quake changed a lot of people's lives. I don't think there is a single game out there that created jobs for as many people as Quake, the two largest examples being Valve and GameSpy. Quake also pretty much single-handedly got the 3D video card revolution started.
ronnier 10 hours ago 1 reply      
Quake also changed my life. I bought my first computer to play Quake, which got me into scripting and making video game websites. That got me interested in programming and led me to a master's in CS and programming jobs while in school. Now I'm at Amazon thanks to John Carmack!
Tycho 6 hours ago 0 replies      
I like how he didn't even get started till he was about 24 and had no head start from previous work. It seems rare to read a success story that doesn't involve people getting obsessed with an activity in their mid-teens (often building on a good academic performance in maths or something like that).
Maxious 10 hours ago 0 replies      
http://gamessavedmylife.com/ is a growing collection of stories about how playing games has helped people emotionally, often in ways their creators probably didn't envision.

But of course, games and game modding has had a profound impact on a lot of technical folk. Many late nights bending BSP trees to my will in Valve Worldcraft ;)

robryan 1 hour ago 0 replies      
Similar story in a way: games like FF7, 8, and 9 got me into RPG game making, which got me involved in community websites, which then led to me learning PHP to help improve those websites.
emp_ 8 hours ago 2 replies      
IMO, his wife changed his life. Having dreams/passions is very common; having the push to pursue them, very rare.
rgbrgb 6 hours ago 0 replies      
Pretty inspiring story but this really made me laugh:

"The kind of stuff that most people think is really cool now, but would immediately relegate you to punching bag status, and honestly not very cool with the chicks back then."

I think you probably just started spending a larger proportion of your time with people who share your interests. Fantasy novels are still not cool in high school. :)

pnathan 10 hours ago 0 replies      
I bought my first computer to play games. I thought I could do better, so I started modding games (X-Wing vs. Tie Fighter). I got tired of fighting other people's game ideas and wanted to make my own. And the path to that lay through a BSCS. Then I realized that there were more interesting and fulfilling aspects to programming besides games.

But my story isn't as awesome as the author of the article's.

staunch 6 hours ago 0 replies      
Count me as another. Quake was a huge part of why I loved computers. I also created and published a number of maps and seriously considered trying to become a pro level designer. Linux and web programming eventually became more interesting, but that inspiration was critical.
davidhollander 8 hours ago 0 replies      
>I think it was either Qed, or Qoole...

Ah, Qoole was the first level editor I ever tried. What I vividly remember about Quake 1 though was all the mods! I spent hours tying up the phone on a 14.4K modem hunting for new stuff to try.

Grappling hooks, bots, friendly attack dogs, Quake Rally which converted it into a racing game, Air Quake which added pilotable helicopters and tanks... I didn't get into coding until UnrealScript, but Quake 1 definitely got me into the internet.

gavanwoolery 6 hours ago 0 replies      
I started out with tools like Deluxe Paint and Animation and QBASIC. I learned 3D modeling long before I touched a level editor, using tools like POVRAY and some crappy Windows 3.1 3D rendering/modeling package. I think the first level editor I used was Ken Silverman's for Duke Nukem 3D. But if I had to point out a game that really changed my life, it was Ultima 7 - it inspired me to learn art, programming, design, etc. It was so far ahead of its time and even was more interactive than many games are today. It was the closest thing to a "sandbox" game at that point in time, I think.
simonw 10 hours ago 0 replies      
For me it was Team Fortress Classic. I was in between A-Levels and University, not entirely sure what I wanted to do with myself and working a boring job in Office World (UK equivalent of Office Depot) - but in the evenings I was running a TFC clan, then later running a TFC news website. I ended up being hired by an online gaming dotcom which is where I realised that web development was what I wanted to do.
skrebbel 8 hours ago 0 replies      
Awesome wife. I'd say love, even more than Quake, changed the OP's life.

Great story.

pornel 10 hours ago 0 replies      
My story is similar. QuakeC was the first "C-family" language I've learned :)

I've been creating new weapons and battle modes on Amiga (in a tiny, tiny window) and playing those on PCs at school.

I've learned a lot about game physics, geometry and program design.

Kudos for making Quake programming approachable, portable and so much fun.

joeyespo 8 hours ago 0 replies      
Those who are against gaming are the ones who really need to read posts like this. Mario Bros for NES changed my life forever and it's wonderful reading about how others are affected. Even more interesting is how age is completely irrelevant to these experiences.
BasDirks 7 hours ago 0 replies      
In quite a different way Quake(3) shaped my life (being the engine for Call of Duty 4). I got paid to travel around Europe playing it for money.
TeMPOraL 9 hours ago 0 replies      
Computer games are what dragged me into programming in the first place. Later, Quake II source code, Unreal Tournament headers for native development and UnrealScript taught me lots of valuable lessons about game code design and programming in general.
aw3c2 11 hours ago 0 replies      
I (submitter) am not the author.
A tool that lets you automate the Internet nytimes.com
29 points by rmason  5 hours ago   8 comments top 6
westiseast 1 hour ago 1 reply      
I can't help but read the privacy policy and see:

"In some cases, we may choose to buy or sell assets. In these types of transactions, customer information is typically one of the business assets that is transferred"

So basically, I authorize a large number of my social applications (Facebook, Twitter, Google, Tumblr, YouTube, mobile phone) and ifttt gets a single unified linked feed of ALL my data, along with behavioural data too (i.e., that I like to receive emails and texts at 11pm), which it reserves the right to sell. And for a website that doesn't have an obvious charging mechanism, what else can I assume except that their revenue stream will be selling my data?

A data-mining trojan, in my opinion... it's a shame it's actually such an attractive tool.

duck 1 hour ago 0 replies      
derleth 3 hours ago 0 replies      
This is precisely what people in the early-mid 1990s were predicting would happen, with 'intelligent agents' and 'autonomous agents'. Wired was all over that kind of stuff, as I recall.


wolfparade 1 hour ago 0 replies      
I think the guys who made ifttt are HNers. If so, how'd you all do it? How'd you get this NYTimes article?
EGreg 2 hours ago 1 reply      
This is cool, but check out Yahoo Pipes ;)
coob 3 hours ago 0 replies      
ifttt is fantastic; it makes automation mind-numbingly simple.

If you're looking for a different, more advanced kind of web automation, http://fakeapp.com/ is incredible.

Why Facebook's 'Frictionless Sharing' violates HTTP andothernoise.blogspot.com
164 points by alexandros  10 hours ago   58 comments top 24
skrebbel 9 hours ago 5 replies      
This article is so typically dork-disconnected-from-world that it's a bit sad, really.

Any site that uses SOAP violates HTTP (POST is used even for GET-style requests). Any site that has delete links (as opposed to buttons) violates HTTP (barring an onclick ajax action, that is). The big mega bucketload of sites violating HTTP this way (GET having severe side effects) is the very reason offline browsing and prefetching never took off much. Every site that uses Comet very much violates HTTP.

This is no "message to the future to Mark Zuckerberg". It's just using the available tools for the job to their limits.

Note, I'm just as much against the whole frictionless sharing thing as most people here. But the moment we allow geek nonsense like this to overtake the real arguments is the moment the world will stop taking it seriously. This is what happened to the net neutrality debate ("net neutrality"? Who made up that term? How will I ever make my mom care about that?); let's not let it happen here.

daveman692 9 hours ago 4 replies      
Sorry, but this really feels like hyperbole to me. For two reasons:

1) Before any action is shared back from the site to Facebook, the user has agreed to authorize that site (application) and add it to their timeline. Part of that dialog shows what's going to happen (https://developers.facebook.com/docs/beta/authentication/).

2) There are plenty of other examples around the web where submitting an HTTP GET request results in an action. For example, clicking an up arrow on Hacker News submits a GET request which increases the karma score of another author. What becomes more important is how you protect against XSRF and crawlers accidentally changing state within your app.
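The up-arrow example makes the spec's point concrete. A toy sketch of the difference between a "safe" GET and a GET with side effects (hypothetical handlers and state, not HN's or Facebook's actual code):

```python
# Why RFC 2616 calls GET "safe": a GET handler should not mutate
# server state, so prefetchers, crawlers, and retries are harmless.
# In-memory toy state; all names are illustrative.

karma = {"pg": 10}


def handle_get_story(author):
    """A safe GET: a pure read. Fetching it twice changes nothing."""
    return {"author": author, "karma": karma[author]}


def handle_get_upvote(author):
    """An unsafe GET (like an upvote link): every fetch, whether by a
    user, a prefetching proxy, or a search crawler, mutates state."""
    karma[author] += 1
    return {"author": author, "karma": karma[author]}
```

A prefetcher that follows both kinds of link leaves karma untouched in the first case and silently casts votes in the second, which is exactly the accountability gap the article is worried about.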

michael_dorfman 9 hours ago 1 reply      
The important distinction here is that the user did not request the side-effects, so therefore cannot be held accountable for them.

But the user did request the side-effects, when he installed the app, and approved the TOS which specifically said that there would be "frictionless sharing".

Personally, I'm not interested in oversharing, so I'll avoid these apps; but, hypothetically speaking, if I were the kind of person who wanted his friends to know what he was listening to on Spotify, it would be a major pain in the ass to have to manually post a separate update to share the title of each song every three minutes. For this use case, it makes much more sense to install an app that will auto-post the songs, giving prior approval up front, and skipping the friction of asking again with each song.

To suggest that this violates HTTP seems more than a bit hyperbolic.

dasil003 6 hours ago 0 replies      
I'm always happy for another article denouncing frictionless sharing, but I think it's a weak argument.

On one side you have the fact that this is not technically an HTTP violation, because it is not the GET request itself that is triggering anything. It's JavaScript running in the loaded page after the GET request has completed.

On the other side you have the fact that Facebook does not care about what some tiny group thinks. Even if it were a true technical violation, Facebook doesn't give half a fuck about what web purists think; they will only change their behavior when there is a massive uprising among a significant portion of their user base.

Attempting to make it a technical issue detracts from the social-contract argument, which is the real issue and the one that can be explained to anyone: you read a page and Facebook posts it to your news feed. Even your mom understands that explanation, and it probably freaks her out. We need more people trumpeting this from the treetops (outside of geek blogs too), because I think Facebook is ruining it for the rest of us who would like to use some of these techniques to provide actually useful services, instead of shitting all over users' privacy to make a quick buck before the government decides to get involved and lock everything down.

zhoutong 9 hours ago 2 replies      
If it's only a protocol violation, I guess Facebook can use JavaScript to generate a POST request instead. And in fact, they should be doing so.

The permissions and responsibilities of a user should not be associated with the type of request. It's true that users should not be responsible for a GET request, but the retrieval is based on the web page itself. The Facebook SDK will generate the other requests automatically because the permissions have already been granted. The GET request is only the trigger. (Imagine you have already agreed, via a POST request, to post everything you read on ReadWriteWeb to Facebook through their reader app; all of this stays hidden until you read an article. The side effects were already set up.) So I think this is not a protocol violation.

eddieplan9 8 hours ago 2 replies      
So Google Analytics might be the biggest violator of HTTP. Those bastards.
kevingadd 9 hours ago 1 reply      
This could matter in real life if prefetching web pages sets off frictionless sharing. Does anyone know if it does?
wandernotlost 8 hours ago 0 replies      
Let's stop for a moment and consider how the web actually works. Your browser sends an HTTP GET request for a page. The response for this page contains HTML and JavaScript. The browser then executes the JavaScript, often making other requests (GET and/or POST) in order to do its work. In the case of most Facebook operations (I haven't looked at this one specifically), JavaScript executed by your browser makes these requests on your behalf to communicate with and display information from Facebook. Google Analytics works this way too, making additional requests to a different server from the page you directly requested.

Unless the website in question has a back channel directly to Facebook, through which it communicates information about the GET requests you've made, the initial GET request has nothing to do with the data sent to Facebook. Furthermore, because of browser security limitations, this back-channel approach is not always even possible, since the original site does not typically know your Facebook ID (this may not be true for some sites with which Facebook has special partnerships, e.g. Yelp and TripAdvisor). Only requests made to facebook.com would have the cookie information that identifies you.

In summary, the HTTP GET request you make to fetch a web page has nothing to do with the information sent to Facebook, and HTTP's recommendations about GET requests have nothing at all to do with these other requests.

Also, people have been senselessly abusing HTTP for years.

justin_vanw 6 hours ago 0 replies      
What a bunch of nonsense. 'GET' must be idempotent, that is all.

Facebook is big enough that all sorts of sophists are going to come out of the woodwork and make plausible-ish arguments against everything they do. The motivation isn't that the criticism is good; it's that if you can say something bad about Facebook, your comment can become news. That doesn't mean they shouldn't be scrutinized, but it also doesn't mean we should pay attention to stupid articles like this. Or does the 'andothernoise' author think that collecting analytics 'violates HTTP'?

j_baker 3 hours ago 0 replies      
A big reason for this is that it breaks a fundamental contract of web interaction, in place since the beginnings of the web, that users have come to rely upon.

How have users come to rely on GET? Ok, it makes a useful "Would you like to resubmit this action?" popup when you click the back button. And it can be useful to prevent XSRF attacks. But neither of these is really relevant to frictionless sharing. Really the only problematic thing I can think of is the possibility that spammers might be able to use it for spam. But then again, I'm pretty sure I violate HTTP on a daily basis. Maybe I'm one of the bad guys.

sek 9 hours ago 2 replies      
Maybe this will educate users; the reality is that these actions have side effects already. They track you everywhere you go with cookies and create a profile of your surfing behavior. The Facebook like button is so widespread that they know every site you visit.

I would be very interested in my Facebook database dump.

T-R 8 hours ago 0 replies      
In regards to violating HTTP, Frictionless Sharing isn't in any way different from every other non-RESTful service/site on the internet. Facebook could have just as easily implemented the same service RESTfully (not violating HTTP), and it wouldn't be any different.

GET requests have the stateless/idempotence constraint so that they're cacheable, and timing between requests, out-of-order requests, and multiple requests don't become an issue. The 'accountability' referred to in the post is about users of the API - they shouldn't be expected to be aware of server state/side effects, in the same way that users of a function call shouldn't be expected to be aware of its implementation/side effects. It's an engineering constraint - it's not about honorably keeping end-users unaccountable for reading your webpage.
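T-R's cacheability point can be made concrete with a toy cache. Names and structure here are illustrative, not any real proxy's API:

```python
class CachingProxy:
    """Toy HTTP cache. Safe, idempotent GETs may be answered from cache
    without contacting the origin; POSTs always go through, because the
    request itself carries a side effect the origin must see."""
    def __init__(self, origin):
        self.origin = origin          # callable(method, url) -> response body
        self.cache = {}
        self.origin_hits = 0

    def request(self, method, url):
        if method == "GET" and url in self.cache:
            return self.cache[url]    # served locally, no round trip
        self.origin_hits += 1
        body = self.origin(method, url)
        if method == "GET":
            self.cache[url] = body    # only GET responses are stored
        return body

proxy = CachingProxy(lambda method, url: f"{method} {url}")
proxy.request("GET", "/page")
proxy.request("GET", "/page")    # cache hit: the origin is not contacted
proxy.request("POST", "/share")
proxy.request("POST", "/share")  # must reach the origin both times
```

If a GET had side effects, the cached second request would silently drop them; that is exactly the engineering constraint described above.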

benwerd 10 hours ago 0 replies      
I bet it violates more than just a networking protocol.
chalst 8 hours ago 0 replies      
There is no violation of RFC 2616. Two points:

1. The implementors RFC 2616 is concerned with are: (i) of the server involved in a connection, (ii) of the client to that connection, and (iii) of the network that joins the two. Facebook's involvement as a server ended before these connections became live, and so is not one of (i)-(iii).

2. The notion of safety here is in the sense that inserting arbitrary GET requests into a sequence of transactions between client and server will not change the semantics of those transactions. It does not have anything to do with what code the user might have asked the browser to run on receipt of the results of a GET request.

See, e.g., Mozilla's documentation of HTTP: "A safe method is a method that doesn't have any side-effects on the server", from https://developer.mozilla.org/en/HTTP#Safe_methods

ramen 9 hours ago 0 replies      
Next up: Why web server logs violate HTTP.
mattront 9 hours ago 0 replies      
As if the rest of the web plays by the protocol rules. How about Amazon (and most other e-commerce sites) showing you the list of last viewed items? User state is obviously changed by GET requests. That's the reality of modern dynamic sites and I don't see any practical problem in this.

FoF (Fear of Facebook) is a really popular game these days :)

nightshift 9 hours ago 0 replies      

From a quick look over the tutorial, it seems that an HTTP POST is actually made to declare that an "action" (e.g. read, watched, listened) took place.

You would also need to be authenticated into Facebook and the app would have to be authorized to post the action as others have pointed out.
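From memory of the 2011 Open Graph docs, the publish call nightshift describes looks roughly like the sketch below. Treat the exact endpoint shape, and the helper itself, as assumptions rather than Facebook's documented API:

```python
from urllib.parse import urlencode

def build_og_action_request(namespace, action, object_type, object_url, access_token):
    """Build (method, url) for publishing an Open Graph action such as
    "read" or "watched". Hypothetical helper: the endpoint shape is my
    recollection of the Graph API, not a verified signature."""
    endpoint = f"https://graph.facebook.com/me/{namespace}:{action}"
    params = urlencode({object_type: object_url, "access_token": access_token})
    return "POST", f"{endpoint}?{params}"

method, url = build_og_action_request(
    "news", "reads", "article", "http://example.com/story", "USER_TOKEN")
```

Note the two things the comment points out: it is a POST, and it carries an app access token, so the page's own GET is not what publishes the story.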

jaekwon 6 hours ago 0 replies      
Does a URL shortener violate HTTP when it offers analytics on GET links to the public?

I think the OP should understand that HTTP is a leaky abstraction.

GET requests always have some side effect, either on the server end or the client end.

Server A and Server B.

A GETs B's data and offers a mutated version of it, which B GETs and further mutates. Side effects with no POSTs! This is natural, too, in a distributed environment, since each agent should be responsible for mutating its own state, rather than have B POST something to A. You can't abstract these dynamics into HTTP, but you can use HTTP to handle the underlying communication mechanics, like caching and authorization.

spullara 6 hours ago 0 replies      
Most web servers "violate http" in this way when they log the fact that they served the GET request. Very little is done on the web without some side effects.
mikeocool 6 hours ago 0 replies      
Using this argument, essentially every website that uses Google Analytics violates HTTP. The GA javascript triggers a GET request every time you visit a page that it's installed on, and the GET request has the side effect of recording your visit.
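The pattern mikeocool describes reduces to a GET endpoint whose response body is irrelevant and whose side effect is the whole point. A toy version, not GA's actual protocol:

```python
import time

ANALYTICS_LOG = []

def beacon(query, headers):
    """Toy tracking pixel: this GET exists purely for its side effect.
    The caller throws away the body; the server keeps the log entry."""
    ANALYTICS_LOG.append({
        "page": query.get("p"),
        "referrer": headers.get("Referer"),
        "ts": time.time(),
    })
    return b"GIF89a"  # stand-in for a real 1x1 transparent GIF

beacon({"p": "/article/42"}, {"Referer": "http://example.com/"})
```

By the article's logic, every one of these beacons "violates HTTP" just as much as frictionless sharing does.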
jarsj 9 hours ago 0 replies      
As if it's so hard to differentiate between a GET the user was not responsible for and which didn't complete, and one the user was responsible for. As long as Facebook makes it easy for a user to turn off an app that's spammy, or one that knowingly posts "I read some objectionable content", I think we are fine. Agreed, apps have more power, but as long as users and Facebook have more, they will behave and the internet will become a better place.
dorkitude 7 hours ago 0 replies      
The worst HTTP violator by far is the little "up" arrow I didn't click to upvote this article.
hamidnazari 9 hours ago 0 replies      
Nice find, but I don't really think this is violating anything. It's against the HTTP spec, sure, but you give it permission to post stuff on your wall for you.

This reminds me of a guy I met a few months ago who was freaking out because he saw his Facebook profile photo and some of his friends' photos on some website, and he was all like "Facebook has sold my information to this website". It took me a while to convince him of what was actually going on. Please, let's try not to make non-tech-savvy people freak out with this kind of allegation.

The Slow Way To SPDY taoofmac.com
33 points by ditados  6 hours ago   2 comments top
justincormack 5 hours ago 1 reply      
I am waiting for Go to get an implementation... Looks like it should happen http://groups.google.com/group/spdy-dev/browse_thread/thread...
A Gentle Introduction to Symbolic Computation cmu.edu
41 points by jwdunne  8 hours ago   4 comments top 2
calpaterson 7 hours ago 1 reply      
I learnt programming from this book - it is excellent.
hsmyers 7 hours ago 0 replies      
While I'm busy kicking the kids off of my front lawn, let me pause to say that it fails in the same way that Emacs fails---first thing I want to see in a book on a process (language, editor, what-have-you) is how to get it started and how to get out. Makes me a curmudgeon, but hey I'm old and came by it honestly... OBTW, it is otherwise a great read and since it is generic to all Common Lisps, of course it doesn't have the necessary how to start and how to stop---(clisp [if installed] and an eventual (quit)), for those who were wondering :)
Facebook Disconnect google.com
277 points by jmonegro  20 hours ago   83 comments top 19
mdasen 20 hours ago 3 replies      
The author of Facebook Disconnect (Brian Kennish) has written another Chrome Extension called "Disconnect" (https://chrome.google.com/webstore/detail/jeoacafpbcihiomhla...). Disconnect not only deals with Facebook, but also Google, Yahoo, Twitter, and Digg tracking.
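The core decision such an extension makes can be sketched in a few lines. This is a simplification: the real extensions hook the browser's request pipeline and ship much longer domain lists, and the names below are mine:

```python
TRACKING_DOMAINS = {"facebook.com", "facebook.net", "fbcdn.net"}

def in_domain(host, domain):
    """True if host is the domain itself or any subdomain of it."""
    return host == domain or host.endswith("." + domain)

def should_block(request_host, page_host):
    """Block a request to a tracking domain, but only when it is a
    third-party request, i.e. the user is not actually on that site."""
    third_party = not any(in_domain(page_host, d) for d in TRACKING_DOMAINS)
    tracker = any(in_domain(request_host, d) for d in TRACKING_DOMAINS)
    return third_party and tracker
```

So the Like-button request from a news site is dropped, while facebook.com itself keeps working when you visit it deliberately.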
orijing 18 hours ago  replies      
"This extension can access: Your data on all websites"

This part made me chuckle a bit. We are so afraid of Google and Facebook tracking our searches/web pages, yet we freely install plugins from 3rd party developers that can easily gather everything that Google and Facebook can get, and more. In theory, I could make a Facebook Disconnect 2, which secretly sends data back home about what pages have been visited, and nobody except the most vigilant (enough to read the source of the plugin) would know.

Why do we not trust large corporations who have billions of dollars at stake, but trust independent developers who have little skin in the game? Is it because we are those developers, so there's some form of camaraderie?

Luyt 20 hours ago 0 replies      
Also see http://www.ghostery.com/ if you don't want to be tracked by web beacons in a more general way, i.e. not only by Facebook.
exit 15 hours ago 0 replies      
When I want to log in to FB, I open an incognito window. I haven't looked into this myself, but the assumption is that cookies from incognito will not leak into my normal session.

It would be great if Chrome allowed users to create a separate "sandboxed" browser session in each window. I'd like to maintain just one session for each service I log into, including Google/Gmail.

Hmm, maybe that's why they haven't implemented this.

jmonegro 7 hours ago 0 replies      
The author of this extension is also the author of Disconnect.
missy 16 hours ago 0 replies      
There is also another interesting site:


This German site uses a double opt-in for buttons like "Like": the button loads greyed out, and only after a first click does it become a normal Like button, so nothing is sent to Facebook until you opt in.

Urgo 20 hours ago 0 replies      
Available for Firefox & Safari as well from the authors site: http://disconnect.me/
nextparadigms 8 hours ago 0 replies      
I use WidgetBlock. I'm not sure if it does exactly the same thing, but I use it against sites that are heavy with widgets and scripts and make the site load 5x slower (like Techcrunch, although I barely even visit it nowadays).


I'm going to give Facebook Disconnect a try, too.

ecocentrik 19 hours ago 0 replies      
How does Disconnect compare with Chromeblock?

I find it kind of amusing that facebook doesn't display integrated comments at all if you block their cookies. Not to worry, TechCrunch w/out comments ≈ TechCrunch with comments.

eli 10 hours ago 0 replies      
This broke parts of the Disqus admin page last time I tried it.
irrumator 20 hours ago 7 replies      
This recent Facebook smear campaign is interesting to watch on HN. Is it the work of an organized group, or just the hivemind's gobbling up of anything anti-Facebook? Either way, it's poor form and not news; this plugin and its broader 'Disconnect' sibling have been linked several times before.

Nb: I have no skin in this game, I personally don't have a Facebook account, but it's not because I'm some anti-FB zealot.

The frontpage of HN has been very disappointing in the last week with non-substance links littering it, and even less worthwhile comments accompanying them. Let's try not to upvote such frivolous and low-signal links.

skrebbel 18 hours ago 3 replies      
Ok so why would one use a browser made by Big Evil Privacy-hating Spy Firm #1 and then install an extension to prevent logging by Big Evil Privacy-hating Spy Firm #2?

I'd be very amazed if Chrome would not, now or at some point in the (transparently and unstoppably auto-updated) future, keep track of what you're doing, too.

missy 16 hours ago 0 replies      
In Europe there is a similar movement, but it comes from the EU. There has been a big change in the handling of cookies and other privacy issues, so that now you are only allowed to save data when someone who visits your site confirms, via some pop-up, that the site may save it. Big problems for analytics.


matmann2001 19 hours ago 0 replies      
Does anyone know of an Opera Extension like this?
kaitari 17 hours ago 0 replies      
I've already developed the habit of only accessing Facebook in an incognito tab, but cool extension nonetheless.
poona 8 hours ago 0 replies      
In order to block Facebook, this extension is injecting javascript into every page you load. It absolutely should come with a large warning.
law 19 hours ago 1 reply      
Absence of malicious history/intent doesn't render them incapable of being (directly or indirectly) dangerous. One could easily pose a rational argument for users to take prophylactic measures (e.g., Facebook Disconnect, Ghostery, and the other browser plug-ins) based solely on the increasing number of data breaches.[1][2]

However, this is a relatively weak argument, as it requires making an underlying assumption that the leaked data is dangerous. We have no evidence to support the assertion that leaked information of the kind shared on Facebook would pose any danger to the affected users. This is fundamentally different from the dangers of data breaches concerning health and financial records; these records contain information necessary to steal identities and engage in other nefarious operations. Facebook doesn't collect social security numbers or other extremely sensitive personally identifiable information.

Facebook does, however, collect evidence of our predispositions and predilections. Arguably, this information is far more dangerous than mere personally identifiable information, because rather than identifying us outright, it gets to the heart of what makes each of us unique. We are incapable of imagining the complete set of scenarios where this information could be used nefariously, and as such, its dangers fall within the scope of "unknown unknowns".

Compare this situation to the case of a breach of financial data: the uses of this data are well-enumerated, and one could argue that the cost of the next health information data breach is a known unknown. Based on historical evidence, we know that another breach will occur, and we know how criminals use the leaked information. With data on Facebook, however, we don't know whether this information could be used maliciously. Moreover, if it could be used maliciously, we don't know how it might be used. Therefore, it deserves as much privacy (if not more) as one's financial and health records.

[1] http://www.idtheftcenter.org/artman2/publish/lib_survey/ITRC...
[2] http://www.privacyrights.org/data-breach

poona 8 hours ago 0 replies      
You can copy the source code and make your own plugin from it ;)
power78 18 hours ago 3 replies      
I don't really like that he puts tracking javascript in his addons. Take a look at the source of this Facebook Disconnect addon, its at the bottom. Why don't people just write a simple bash script that toggles blocks for the facebook domains in the hosts file?
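power78 suggests a bash script; the same idea is sketched below in Python so the toggle logic is explicit. The domain list is abbreviated, the marker comment is my own convention, and a real script would also need root to rewrite /etc/hosts:

```python
FB_HOSTS = ["facebook.com", "www.facebook.com", "connect.facebook.net"]
MARK = "# fb-block"

def toggle_facebook(hosts_text):
    """Toggle null-routed facebook entries in /etc/hosts-style text:
    add the marked lines if absent, strip them if present."""
    lines = hosts_text.splitlines()
    if any(MARK in line for line in lines):
        lines = [line for line in lines if MARK not in line]
    else:
        lines += [f"127.0.0.1 {host} {MARK}" for host in FB_HOSTS]
    return "\n".join(lines)

before = "127.0.0.1 localhost"
blocked = toggle_facebook(before)      # entries added
restored = toggle_facebook(blocked)    # entries removed again
```

Unlike the extension, this blocks facebook.com entirely, including when you actually want to visit it, which is the trade-off of the hosts-file approach.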
Google And Monopoly Theater techcrunch.com
25 points by zeratul  5 hours ago   3 comments top 2
zeratul 27 minutes ago 0 replies      
The committee was unaware that Google mixes two distinct services: keyword search and question answering.

Keyword search approximates your question and returns an approximate answer. The accumulation of approximations forces it to display a long list of links. Here, the job can be done by a simple ranking algorithm.

On the other hand, if we want to answer a concrete question, we need to provide concrete information. Watson would not have won Jeopardy if "he" had provided just a ranking. Here, the job is done by understanding the question. After that we just need a lookup table. Google is not secretive about their data model. If your web site is fully annotated with OWL and RDF, surely Google can use it as a question-answering lookup resource.
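The two services zeratul distinguishes behave differently even in toy form. Everything below is illustrative, not how Google works:

```python
def keyword_rank(query, docs):
    """Keyword search: score every document by crude term overlap and
    return them all, best first -- an approximate answer, hence the
    long list of links."""
    terms = set(query.lower().split())
    return sorted(docs, key=lambda d: -len(terms & set(d.lower().split())))

# Question answering: parse the question down to a key, then a lookup
# table (in practice built from structured data such as RDF) returns
# one concrete answer instead of a ranking.
FACTS = {("capital", "france"): "Paris"}

def answer(question):
    words = question.lower().split()
    key = tuple(sorted(w for w in words if w in {"capital", "france"}))
    return FACTS.get(key)
```

The ranker never commits to a single answer; the lookup either answers concretely or returns nothing at all.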

gbog 1 hour ago 1 reply      
The finance snippet could have the same layout as ads; that way it would be more obvious that it is not part of the organic rankings, and real ad clicks might even increase.
When Education-Technology Startups Fail mfeldstein.com
30 points by rafaelc  7 hours ago   6 comments top 5
techiferous 5 hours ago 0 replies      
To solve your customers' problems well, you have to understand their problems from their point of view. This is why technical people often need a business co-founder: one who understands the domain well.

I had a brief stint as a teacher (three years in a private school) so while my experience is not vast, I do have a sense for the teacher's point of view.

Teachers often do resent spending their own money on their classrooms, but it's not out of a sense of entitlement. Often the budget is tight, so teachers have to buy pencils, classroom posters, etc. out of their own small paycheck. Also, simple things like how many copies you can make are limited.

So imagine a software shop where each developer:

(1) had their bandwidth per month capped

(2) if they wanted an extra monitor, productivity software, or anything beyond the standard developer setup they'd likely have to pay for it themselves

(3) has their salary cut in half (or by two thirds)

So it's not a case of entitlement at all; it's a case of scarce money.

About teachers not wanting technology solutions: as a teacher I had very little time during the day. There were some days where I very literally did not have five minutes to spare and had trouble finding time to go to the bathroom (other days were more sane). My personality is one that craves change, but as a teacher I found that I was juggling so much work that I did not necessarily welcome changes to existing procedures. So I think there may be some truth to this in that teachers may not have the time resources to adapt to a lot of new technology. That said, target the teachers in the summer. They've got time to make changes then.

danielford 4 hours ago 0 replies      
Maybe I'm approaching this differently since I teach college. My initial response was, "Why was that guy charging a subscription fee for a gradebook?" Granted, the gradebooks that come with the Angel and Blackboard learning management systems probably aren't as good as his was. Still, they do what I need them to and I'm not paying for them.

I always figured K-12 teachers had similar services available. Assuming they don't, it's not that big of a deal to throw together a gradebook in Excel.

I feel bad for this guy; based on the demonstration video it looks like he put an awful lot of work into producing a well-polished product. I just don't see how using it would save me any time or improve my teaching.

derBaumstamm 6 hours ago 1 reply      
Knack for Teachers' failure seems indicative of a larger trend in the start-up world: the explosion of doomed-to-fail start-ups founded by mediocre programmers with no understanding of their target market. These founders believe that their subpar efforts will somehow lead to the next Airbnb or Dropbox, ignoring the careful planning and sheer brilliance at the heart of all great start-up successes.

My fear is that just as interest in ed-tech is waxing, the field will be flooded by these wannabe founders who fail miserably and then blame their failures on parents, teachers, and children. Education is a complex and difficult market that requires a founder to have deep knowledge of its inner workings. For the technological innovation that is desperately needed in education to be successful, our best hackers must step forward and accept the challenge. Nothing else will do.

I worry. However, John Resig's work at Khan Academy gives me hope.

tryitnow 5 hours ago 0 replies      
Blaming your customers for failure is never a good idea.

I looked into doing a K12 school related startup and quickly concluded that it was a no go. The only money is in school budgets and the sales cycle is just too long and onerous.

The probability of success for an Edtech startup is inversely related to its proximity to schools.

mise 5 hours ago 0 replies      
patio11, your thoughts? I think you sell directly to teachers.
Rails is not MVC andrzejonsoftware.blogspot.com
116 points by andrzejkrzywda  15 hours ago   42 comments top 11
zzzeek 11 hours ago 4 replies      
Practicality beats purity. Even though the Python frameworks are moving away from the "MVC" term, it always felt like "MVC" to me, in that there are three distinct components: data objects (model); some kind of "load the model in response to a URL and display a template" (controller - what better name is there? I think "view" is a crappy name for that since it doesn't define presentation); then the "template", which seems like a view to me - it's the thing you're viewing!

That the model isn't notifying the view through an event system is splitting hairs. The GOF book is much maligned, I think, because people insist on taking each pattern completely literally, down to the last detail. GOF's pattern (GOF didn't create it, but they discuss it on page 4 - I thought I was crazy until I just checked) is specifically "MVC, and because we only know about C++^H^H^H Smalltalk graphical libraries and not very much about stateless HTTP systems yet in 1993, the model notifies the view of changes too, how else would it work?" IMHO. It's an implementation detail, not the essence of the pattern.

So if a JS framework is now doing MVC that includes the concept of "model notifies the view", sure, call it "classical MVC" or "model-notify-view-controller" (MNVC).

simonw 13 hours ago 5 replies      
This is exactly why Django used to call itself an "MTV" framework (Models, Templates, Views) rather than an "MVC" framework - we realised that the classic MVC pattern didn't really apply to the Web.

We lost that argument because no one cared - in fact, I think we probably caused a whole load of confusion by not using the same terminology as all of the other frameworks.

dasil003 12 hours ago 3 replies      
I don't see the point in making these subtle pattern distinctions. The model notifying the view is not a significant enough architectural criterion to merit its own term. I submit that there are dozens of such architectural variances between MVC frameworks, and that defining high-level terms to codify those differences isn't helpful. I'll even take it a step further and suggest that a focus on taxonomy diminishes one's ability to think fluidly about the low-level architecture and how to improve it to solve different problems.

This is quite different from (for example) the debates about the definition of REST. In that case there is a seminal paper and all kinds of subtle points leading to material benefits in web apps. What does Model2 have to teach us?

extension 10 hours ago 1 reply      
Here is my personal understanding of the principles of MVC, without regard to any specific platform or implementation:

The model is an API to the domain of the application that is not coupled to anything else. It is suitable for use with a user interface, automation, or as a component of a larger model. If the application domain is about storing data, then the model will provide access to that data and enforce its validity. But the domain could also be something that is not stateful per se, like interfacing with hardware.

A view is an independent component of a user interface for the model, and is tightly coupled to it. It allows humans to interact with some part of the model in some way.

The controller organizes views into a complete end-user application by instantiating them and connecting them to models. It should be minimally coupled to the details of views and models, and is optionally composable.

pestaa 14 hours ago 1 reply      
He is correct, but I doubt there is much confusion. The term MVC is used mainly to illustrate the separation of concerns, which is, IMO, the number one task for any framework.

Also, as far as I recall, the MVC pattern does not require communication between the models and the views, although it certainly allows it.

Looking at it from another perspective, MVC is a huge buzzword nowadays. It's simply not worth being conceptually correct for such little gain.

strictfp 12 hours ago 3 replies      
In my opinion the term MVC should be avoided. It was badly defined from the very start and has been misused to the point where it is mostly a buzzword. Beginners spend precious time trying to understand the concept using flawed guides and tutorials, whereas experts dismiss the term as being too unspecific. My favourite CS teacher taught us MVC simply by suggesting that we make a command-line client before building a GUI. There must be more practical approaches like this that one can use to explain the idea.
andrzejkrzywda 10 hours ago 0 replies      
OP here.

There's one more reason why I want to clarify the situation with terminology.

Apart from being a Rails developer and running a Rails company I also teach Ruby on Rails at a university.

Obviously, I'm not the single source of knowledge for the students, however I want to clearly explain to them what is Rails, what is MVC and what is Model2.

It doesn't help me that when they go to almost any Rails website there's information saying that Rails is MVC, which it is not.

Here and on my blog you can find comments from people who are confused with the current situation. I'm not expecting that today we're going to agree on any solution. I just want to point out that the definitions and their usage are not precise.

sjs 11 hours ago 0 replies      
Ceci n'est pas une pipe.
nakedslavin 8 hours ago 0 replies      
What a strange feeling this post gave me. I read it, then I thought about other frameworks, even wanted to argue that purity in terminology is overestimated, and you know what? Who cares.

I don't know if that's because client-side frameworks are becoming much more important, and talk about the differences between Django and Rails sounds more and more pathetic, or maybe it's a personal thing and I just need to get some sleep :)

hello_moto 11 hours ago 1 reply      
Is JavaScript MVC really JavaScript MVC (I've never used any so I don't know)?

And is MVC the right pattern for rich UI app (GUI or Web client)?

The last time we tried to use MVC in GWT app, it didn't work out quite well. We decided to use MVP + EventBus instead.

These days the MS camp came up with MVVM (a variation of MVP or more closely related to Presentation Model, which I think MVVM is borderline architecture astronaut, but meh, I might be biased).

MVC seems to suit the widget level better, as opposed to being the architecture pattern for a rich UI app.

rgbrgb 11 hours ago 0 replies      
A similar discussion about Cocoa MVC: http://stackoverflow.com/questions/353646/design-patterns-fo...

The point is that it gets you to think about design patterns that the framework writers intended.

Anybody know if the rails MVC pattern was based on Cocoa? I know a lot of Rails devs who are also Cocoa/Touch devs.

Genomics - The new biological science, a decade after human-genome project economist.com
14 points by joeyespo  5 hours ago   5 comments top 2
bhickey 1 hour ago 1 reply      
The Broad is an amazing place, and I'm very grateful for my time there. The folks there simply do remarkable work.

Nevertheless, I'm ambivalent about the size of the grants doled out to the senior faculty there. I've heard post-docs and junior faculty express concerns that it's a poor allocation of resources. For one $25m grant issued to a senior faculty member, you could fund 50 junior faculty for years. If diversification is the only free lunch in finance, why should science be any different?

throwaway32 4 hours ago 1 reply      
This technology is very cool, and has a lot of future promise for treating illness and detecting disorders before they manifest themselves.

One thing I am extremely concerned about is what other uses this kind of data will be put to. Is your genetic data one National Security Letter away from being put in a government DNA database? I'm also certain that insurance companies would really like to get their hands on this kind of data/analysis. What about certain kinds of jobs? Will you be required to submit a genetic profile to prove you can perform your duties?

I think a situation like Gattaca[1] is not too far off if we don't tightly control who is allowed access to this kind of information.


e: corrected "gattica" typo

HaXe 2.08 ncannasse.fr
50 points by swah  11 hours ago   10 comments top 4
jxcole 7 hours ago 0 replies      
HaXe is a great platform. I vastly prefer it to javascript because I am a big fan of static typing. The only thing that irks me about their language is that they have both interfaces and class extension but for some reason decided not to have abstract classes.

This irks me because it seems to be a design choice; they aren't omitting it because they haven't gotten around to it, they're omitting it because they don't want it. I just don't get why they would leave this feature off but still have extension. (I am totally fine with Go, which doesn't have abstract classes but also doesn't have extension.)

It's only a minor quibble I suppose, I still recommend developing in HaXe if you can.
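For readers who haven't hit the distinction: an abstract class combines an interface with inherited behavior, which a plain interface can't supply. Sketched in Python rather than HaXe (which, per the comment, lacked the feature at the time):

```python
from abc import ABC, abstractmethod

class Shape(ABC):
    """Abstract class: declares `area` the way an interface would, but
    also ships shared behavior (`describe`) that subclasses inherit."""
    @abstractmethod
    def area(self):
        ...

    def describe(self):
        return f"{type(self).__name__} with area {self.area()}"

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side * self.side
```

With only interfaces plus class extension, you can approximate this by extending a concrete base class whose "abstract" methods throw at runtime, but you lose the compile-time guarantee that subclasses implement them.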

shin_lao 9 hours ago 0 replies      
HaXe is the technology behind the great game Hordes (http://www.hordes.fr/, French only unfortunately).
Cushman 8 hours ago 2 replies      
Something about this gives me the willies... It feels like a language designed on indie game forums for non-programmers who can just about wrap their heads around the basics of ActionScript. The value it brings doesn't really make sense for a non-trivial project: compiling for, say, Flash and <canvas> sounds nice, but in what circumstance would I really want the same code to target PHP, AS3, and C++?

Even the name seems to be saying "Using this will make your game cooler!"

On the other hand, when I say that out loud, I probably sound like quite the elitist... it's not doing any harm to me, and getting more people into coding has gotta be a net good. Maybe I'm just not looking at it right as a programmer rather than someone less technical who just wants to get their game out to as many people as possible. Can anyone set me straight on this?

Whoever is downvoting me is exactly who I mean.

gregwebs 7 hours ago 1 reply      
Apparently there are technologies that can compile this to C++, javascript + CSS, Flash, webOS, and Android.

I haven't used this, but javascript + types (with type inference) is very appealing.

       cached 26 September 2011 02:02:01 GMT