hacker news with inline top comments    27 Oct 2013
C-- cminusminus.org
288 points by msvan  6 hours ago   51 comments top 17
carterschonwald 2 hours ago 1 reply      
Hey everyone, I'd like to point out that the C-- domain is no longer cminusminus.org; the historical site can be found on Norman Ramsey's homepage here http://www.cs.tufts.edu/~nr/c--/ ! It also has actual reading material / papers!

The cminusminus domain is no longer maintained (though it has more modern CSS), and it lacks links to all the informative papers!

C-- is very similar to LLVM IR. There are crucial differences, but overall you could think of them as equivalent representations you can map between trivially (albeit that's glossing over some important details).

In fact, a few people have been mulling the idea of writing an LLVM IR frontend that would basically be a C-- variant. LLVM IR has a human-readable format, but it's not quite a programmer-writable format!

C-- is also the final representation in the GHC compiler before code gen (i.e., before the "native" backend, the LLVM backend, and the unregisterised GCC C backend).

There's probably a few other things I could say, but that covers the basics. I'm also involved in GHC dev and have actually done a teeny bit of work on the C-- related bits of the compiler.

Relatedly: I have a few toy C-- snippets you can compile and benchmark using GHC, from a talk I gave a few months ago: https://bitbucket.org/carter/who-ya-gonna-call-talk-may-2013... https://vimeo.com/69025829

I should also add that C-- in GHC <= 7.6 doesn't have function arguments, but in GHC HEAD / 7.7 and soon 7.8, you can have nice function args in the C-- functions. See https://github.com/ghc/ghc/blob/master/rts/PrimOps.cmm for GHC HEAD examples, vs https://github.com/ghc/ghc/blob/ghc-7.6/rts/PrimOps.cmm for the old style.

m_mueller 5 hours ago 2 replies      
Could someone enlighten me as to what the advantage of this is over LLVM IR?

Edit: Ok, I've found the following SO thread: http://stackoverflow.com/questions/3891513/how-does-c-compar...

sambeau 5 hours ago 1 reply      
According to this, C-- is still a large part of the Glasgow Haskell Compiler. It looks like (Fig 5.2) code goes into C-- before being translated to LLVM.


paulhodge 6 minutes ago 0 replies      
Would C-- be a good choice for JIT machine code generation, or is it mostly for static compilation?
gdonelli 1 hour ago 0 replies      
"As of May 2004, only the Pentium back end is as expressive as a C compiler. The Alpha and Mips back ends can run hello, world. We are working on back ends for PowerPC (Mac OS X), ARM, and IA-64. Let us know what other platforms you are interested in."
peapicker 5 hours ago 1 reply      
When I first entered college in 1988 there was a small DOS compiler floating around called C-- back then, which I got from some BBS (yes, a BBS, how antiquated!), probably in 1989. It was a mix of a subset of C and proto-assembly. I have looked for it a few times over the years, and this one isn't it, although it has some similar ideas. It makes me wonder how many other little-known C-- projects there are.
secoif 5 hours ago 3 replies      
According to this infographic, a C-- was an influence on JavaScript.


If there's any truth to this (perhaps a different C--?), I'd like to know which one is being referred to.

protomyth 2 hours ago 0 replies      
I must be missing something since I see the line "The specification is available as DVI, PostScript, or PDF.", but cannot find any download link.
cylinder714 3 hours ago 0 replies      
Another take on portable assembly languages is Dan Bernstein's qhasm: http://cr.yp.to/qhasm.html

An overview here: http://cr.yp.to/qhasm/20050129-portable.txt

Radim 5 hours ago 2 replies      
The originality (and practicality!) of choosing the name "C--" leaves me speechless.
Dewie 4 hours ago 1 reply      
Say you're writing a compiler for a language in Haskell, and you want to generate machine code rather than having it be interpreted. Is C-- a natural choice on this platform? Or might LLVM be a better choice?
EGreg 40 minutes ago 0 replies      
If the language is called C--, how come the website is named Cminusminus?
vezzy-fnord 5 hours ago 2 replies      
This is pretty old. I don't know if anyone else besides the GHC team use it?
EGreg 39 minutes ago 1 reply      
How does C-- compare to the CIL of .NET?
ErsatzVerkehr 3 hours ago 0 replies      
Where can I find a code example?
mrcactu5 5 hours ago 3 replies      
This seems really awesome, but I have no idea what it does.

I know that Python compiles to C and that Clojure compiles to the JVM (or even to JavaScript).

My cartoon:

  scripting lang --> programming lang --> native code
Honestly, I have never experimented with Assembly language much except for COOL (http://en.wikipedia.org/wiki/Cool_(programming_language)) and TOY (http://introcs.cs.princeton.edu/java/52toy/).

davebees 4 hours ago 1 reply      
Is the 'minus minus' being converted into a dash throughout?
Lou Reed, Velvet Underground Leader and Rock Pioneer, Dead at 71 rollingstone.com
26 points by coloneltcb  1 hour ago   7 comments top 4
mcphilip 1 hour ago 0 replies      
RIP Lou Reed, one of my favorite musicians. The song Heroin has so much dissonance and noise throughout that it's almost revolting to listen to. Over time it grew into one of my favorite songs in that I learned to find beauty in the clash of consonance and dissonance. At the risk of hyperbole, that mindset of finding something new from the conflict of opposing forces is something that now applies in far more areas of my life than just music, and the VU was the vehicle that introduced me to that philosophy.
crapshoot101 1 hour ago 0 replies      
Damn it - amazing musician. Hope he's taking the proverbial walk on the wild side.
ssully 1 hour ago 1 reply      
Damn you Sony and your Perfect Day commercials...

I think now's a good time to go listen to Transformer again.

cprncus 1 hour ago 2 replies      
I don't see how general celebrity deaths are hacker news. I would like to read about tech and computer news and ideas; if I wanted this, I would read general news sites.
False detection on file tcpip.sys kaspersky.com
15 points by ColinWright  52 minutes ago   6 comments top 4
timsally 12 minutes ago 0 replies      
Ouch. To make things even worse, the executable to fix the problem is (1) not delivered over HTTPS and (2) not signed by Kaspersky (it is unsigned). So not only are you unable to receive an authenticated update to your antivirus database to fix the problem because your network is down, but you have no assurances that the offline tool provided to you to fix the problem actually came from Kaspersky.

One of the tools, regextr.exe, is signed, but kaspersky_tcpip_fix.exe, the tool they have you run first, is not.
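Absent a signature, the only way to check a tool's provenance is to compare a digest obtained over a separate trusted channel. A minimal sketch of that check (the idea of a vendor-published SHA-256 is an assumption for illustration, not something Kaspersky actually offered):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Stream the file through SHA-256 so large downloads don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def is_authentic(path, expected_hex):
    """Compare against a digest published out-of-band (e.g. on the vendor's HTTPS site)."""
    return sha256_of(path) == expected_hex.lower()
```

This is weaker than a proper code signature (it only works if the expected digest itself came over an authenticated channel), but it's better than running an unsigned binary on blind faith.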

pudquick 2 minutes ago 0 replies      
Still not quite as bad as when McAfee DAT 5958 misidentified svchost.exe (the parent process for all DLL-based services) on XP as malicious. As you can imagine, this didn't go over well. I remember our shop being very glad we were on a delayed deployment for their DATs.


Scaevolus 24 minutes ago 2 replies      
It's odd that they don't have a database of "known good" files taken from a clean install of each OS they support that they test against before releasing an update.
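Scaevolus's suggestion is simple to prototype: hash every file from a clean install into a whitelist, then refuse to ship any signature update that flags one of those files. A toy sketch (the paths and the hash-based detection format are made up for illustration):

```python
import hashlib
from pathlib import Path

def build_known_good(root):
    """Map SHA-256 digest -> filename for every file under a clean OS install."""
    table = {}
    for p in Path(root).rglob("*"):
        if p.is_file():
            table[hashlib.sha256(p.read_bytes()).hexdigest()] = p.name
    return table

def release_blockers(detected_hashes, known_good):
    """Detections that hit known-good system files; any hit should block the release."""
    return sorted(known_good[h] for h in detected_hashes if h in known_good)
```

A real AV vendor would run the candidate engine over clean-install images of every supported OS, but the regression gate is the same idea: known-good hits mean the update doesn't ship.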
vezzy-fnord 29 minutes ago 0 replies      
For other similar gaffes, see: http://attrition.org/errata/sec-co/

It's amazing what kind of things slip by.

Spyder Scientific Python Development Environment code.google.com
44 points by rbanffy  3 hours ago   11 comments top 7
Rickasaurus 2 minutes ago 0 replies      
Spyder is a really nice interface, but it has one huge problem (on Windows at least): you can only have one instance of it open at any given time. That's hugely irritating for experimentation, so mostly I use IPython Notebook these days.
jofer 17 minutes ago 0 replies      
Spyder is great for people coming to Python from MATLAB. I recommend it a lot to complete beginners as well.

For those not aware, it's basically an offshoot of python(x,y), which is a really nice python distribution for windows.

Personally, I'm far, far too wedded to vim + ipython to use anything else, but it is _really_ nice to be able to point people using windows to python(x,y). I have nothing against commercial distributions like canopy or anaconda (which offer many advantages), but there are a lot of cases where a freely-redistributable option makes more sense.

elyase 2 hours ago 0 replies      
Windows users can also consider Visual Studio + Python Tools [1]. Another multiplatform alternative is Enthought's Canopy (Free Academic License) [2]. I consider both to be superior alternatives to Spyder for scientific Python usage. And of course there's the IPython notebook [3], which is my favorite alternative, though you still can't easily inspect variables; I hope that comes soon with the JavaScript enhancements.

I appreciate the great effort behind Spyder, but I think the UI, the documentation, the website, etc. lack a lot of polish and attention. I have tried it a couple of times and I can never figure out simple things, like installing packages and managing the environment.

[1] http://pytools.codeplex.com

[2] https://www.enthought.com/products/canopy/

[3] http://ipython.org/notebook.html

wirrbel 3 hours ago 0 replies      
I use Spyder with SciPy and it generally works quite well. Normally I am a vim user and do not like IDEs, but Spyder makes switching between data, sources, and the REPL really neat.

On the other hand, only with Spyder do I get so many annoying trailing spaces, for some reason.
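If an editor keeps leaving trailing whitespace behind, a tiny cleanup pass is easy to run before committing. A generic sketch, not anything Spyder-specific:

```python
def strip_trailing(text):
    """Drop trailing spaces and tabs from each line without touching anything else."""
    return "\n".join(line.rstrip(" \t") for line in text.split("\n"))

def strip_trailing_file(path):
    """Rewrite a file in place with trailing whitespace removed."""
    with open(path) as f:
        cleaned = strip_trailing(f.read())
    with open(path, "w") as f:
        f.write(cleaned)
```

Many version-control hooks (and editors) can do the same thing automatically on save or commit.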

thearn4 19 minutes ago 0 replies      
My students this summer used python/numpy/scipy with spyder in place of MATLAB. I was pretty impressed with it overall.
alexrson 3 hours ago 2 replies      
Does anyone with experience with this IDE have an opinion about whether it is superior to pycharm?
emansim 32 minutes ago 1 reply      
It doesn't work on OS X Mavericks.
Fifty Three can now print Moleskin with your design in it fiftythree.com
5 points by kirillzubovsky  13 minutes ago   1 comment top
spodek 2 minutes ago 0 replies      
Am I missing something with Moleskine notebooks?

I kept a diary for two decades, going through over two dozen notebooks in the process. In graduate school I also kept lab notes for years. I used cheap notebooks. Never did I see the value of Moleskines justifying their cost.

Am I missing something?

They only remind me of this Onion article -- http://www.theonion.com/articles/privileged-little-artiste-w...

Privileged Little Artiste Writing Something Oh-So-Precious Into His Moleskine Notebook

SAN FRANCISCO -- After gently unfastening the elastic strap keeping his dearest musings safe from prying eyes, little literary artiste Evan Stansky penned a few more darling thoughts into his clothbound Moleskine notebook Wednesday. "These are much higher quality than the notebooks you find at CVS," lilted the auteur, who couldn't be bothered to use -- dare it be said -- a journal of lesser craftsmanship or pedigree, or one not famously used by such legendary artists as van Gogh and Hemingway. "They're a little more expensive, but I try to write on both sides so I don't go through them as quickly." At press time, the princely scribe was seen finishing his apricot jasmine tea, asking a mere mortal sitting nearby to watch his literary accoutrements, and then prancing off to the Starbucks powder room, light as a feather.

Clojure from the Ground Up, Part 2: Basic Types aphyr.com
87 points by mrbbk  6 hours ago   16 comments top 5
antics 12 minutes ago 0 replies      
I am a member of one of the underrepresented groups aphyr mentioned. Let me tell you what effect the discussion in this HN comments thread has had on me.

I read aphyr's introduction. It was kind of like a warm "you can do it". It made me feel kind of warm and fuzzy that someone was looking out for people like me, and other groups that are more obviously disenfranchised than mine (mine has it comparatively well off, I think).

And because of this encouragement, I read the entire post. So was it effective, at least for me? Yes it was.

Cue HN comments. I read them, and my first reaction was that I didn't understand why it's not ok to say what aphyr said. Never mind that he put it in the "who is this for" section, which seems (to me) to be an eminently appropriate place to put such a thing.

But as I read more of them, I began to wonder whether what I thought and what other people think are so different that I'm just never going to fit in with this community.

I began to doubt myself. Eventually the entire effect of the introduction was reversed. Soon I felt worse than when I started.

Then today happened. Let's consider some facts.

* I saw yesterday that aphyr wrote a book, and in his "who this is for" section he wrote that it's partially to encourage underrepresented groups to program.

* That was part 1.

* This is part 2.

* Not only were the HN comments on part 1 dominated by this issue, but also the HN comments in part 2.

* So, merely writing this once is enough for the issue to follow you around in subsequent posts.

As a member of one of these underrepresented groups I'm both shocked and -- honestly? -- kind of hurt. If aphyr can't write this in the "who is this for" section, then where is it appropriate to have this discussion?

People of HN, you may not be convinced that this was the right thing to do, but do know that this type of discussion is actively hurting your ability to be diverse.

diego 30 minutes ago 0 replies      
Useful tip:

"By default Clojure operates with natural numbers as instances of Java's long primitive type. When a primitive integer operation results in a value that is too large to be contained in a primitive value, a java.lang.ArithmeticException is thrown. Clojure provides a set of alternative math operators suffixed with an apostrophe: +', -', *', inc', and dec'. These operators auto-promote to BigInt upon overflow, but are less efficient than the regular math operators."

So you can write

  (inc' Long/MAX_VALUE)

instead of

  (inc (bigint Long/MAX_VALUE))
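The distinction in the quoted docs is easy to mimic in another language. Python integers auto-promote like inc' does, so the checked Java-long behaviour of plain inc has to be simulated (a sketch for illustration only, not how Clojure implements it):

```python
LONG_MAX = 2**63 - 1  # Java's Long/MAX_VALUE

def inc_checked(x):
    """Like Clojure's inc on a primitive long: error on overflow rather than wrap."""
    if x + 1 > LONG_MAX:
        raise OverflowError("integer overflow")
    return x + 1

def inc_promoting(x):
    """Like Clojure's inc': silently grow past 64 bits (free with Python's ints)."""
    return x + 1
```

The trade-off is the same one the docs describe: the promoting version never throws, but giving up the fixed-width representation costs performance.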

merlinsbrain 1 hour ago 1 reply      
Has anyone at all managed to get past the first paragraph? It's his blog, his tutorial, his sweat, disseminating knowledge to everyone for FREE. So he encourages a certain group of people; I don't see any of them reacting. Take the education and thank aphyr for his efforts. Way too many people take that for granted these days.
the-original 5 hours ago 1 reply      
It's an okay intro I guess, but there's a complete guide written for complete noobs. http://www.reddit.com/r/Clojure/comments/1pb8sx/the_original...
lectrick 2 hours ago 2 replies      
I have mixed feelings about how, right off the bat, it sledgehammers home the fact that programming is for everyone and not just white Anglo-Saxon males.
Repobuild: Declarative-style build system, like Google Build github.com
25 points by albertzeyer  3 hours ago   4 comments top 3
ChuckMcM 16 minutes ago 1 reply      
Interesting; if Chris can pull this off, it will change the world in a huge way. Having used the build system inside of Google, it was somewhat mesmerizing: you really could put things together pretty easily. I suspect there is something similar inside Amazon. But while I completely agree that GitHub has done wonders for making many smaller projects accessible, it has not significantly increased our ability to tie them together. That seems to be part of the motivation for the CCAN effort as well, although on a more language-specific vector. But as a proof of concept, the CPAN system (the one for Perl) demonstrates just how powerful this function can be in enabling large-scale rapid development.

The flip side though, the monkey in the wrench as they say, is that the projects that participate have to conform to the rules of participation. In CPAN's case it's a standardized dependency, build, and install model with required unit tests. So perhaps the first step here is to provide an incentive for clean build processes on a project.

nercury 25 minutes ago 0 replies      
This looks great! May be a reason for me to ditch CMake, but I kind of need Windows support :/. Is it much work to add?

I wish someone made a similar way to share libraries like we can do in PHP (https://packagist.org) or Ruby (http://bundler.io/), but for C++.

saljam 49 minutes ago 0 replies      
Judging by his description of BUILD, it sounds like it was a big influence on Go's build system.

Well, no surprises there!

Nissan Tests 48 kWh Battery In Leaf cleantechnica.com
15 points by codex  2 hours ago   9 comments top 5
pkulak 28 minutes ago 1 reply      
This is all much ado about nothing. Nissan put a bunch of cells in a Leaf for one race. It's not like before this they didn't think they could, and just now figured out how to... add more batteries. The 2014 model will not have twice the range for the same price.

Nissan has a 24 kWh battery for a reason: they can sell the car, after incentives, for about the price of a Prius. Doubling the pack would probably add about 10 grand to the price. If Nissan thought that was where they wanted to position themselves in the market, they would have done so already.

Though, personally, a 48 kWh Leaf would be pretty awesome. I'd pay an extra 10 grand for that. It would be a nice middle ground between an 80-grand Tesla and the current Leaf.

mikestew 33 minutes ago 0 replies      
Of course it's not like the bigger pack will be showing up in 2015 Leafs. I'm guessing the test version has no rear seats or trunk space. It also wasn't a test in the sense of "test this before it goes into production"; the Leaf was modified for a race. A stock Leaf won't make an hour running flat out (electronically limited to 93 mph). The bigger pack, as far as I can tell, was installed to make it more competitive.
rdl 1 hour ago 1 reply      
From my experience with 50-100 mile BMW 1e cars, I am pretty sure I'd require at least 200 miles to be comfortable with a car in the Bay Area unless I had a commute pattern with guaranteed charging both at home and at work (and went to the primary work site and remained there most of the day). The "big" Tesla seems like a much safer bet at 250-300 miles.

I guess a 48 kWh Leaf is about the same as the 60 kWh Tesla in range? (it might not be; cheaper/lamer vs. physically larger but more advanced might be a wash).
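Back of the envelope, the comparison reduces to capacity times efficiency. A rough sketch; the miles-per-kWh figures below are assumptions for illustration, not official EPA ratings:

```python
def est_range_miles(capacity_kwh, miles_per_kwh):
    """Crude range estimate: usable pack capacity times average driving efficiency."""
    return capacity_kwh * miles_per_kwh

# Assumed efficiencies: the lighter Leaf a bit better per kWh than a Model S.
leaf_48 = est_range_miles(48, 4.0)    # 192.0 miles
tesla_60 = est_range_miles(60, 3.5)   # 210.0 miles
```

With those assumptions the two end up in the same ballpark, though real-world efficiency varies a lot with speed and temperature, which is exactly the uncertainty rdl flags.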

LoneWolf 1 hour ago 1 reply      
What every article like this (and about any other EV) fails to mention is how many km it can do before needing to be charged, how long it needs to charge, etc. I'm sorry, but that does not make me want to buy an EV, not until I can easily get all the numbers to compare properly.
sliverstorm 1 hour ago 1 reply      
Cool! Aside from price, range is the critical limiting factor in Leaf buying decisions (to hear folks considering one tell it). It doesn't have to get 300 miles to a charge to be a good commuter replacement, but given that environmental conditions (e.g. winter, road speed) can decrease the current model's effective range to below fifty miles, even moderate commutes can start pushing your luck. A worst-case range of 100 miles should make it viable for all but the farthest commutes.
Profitless Prosperity avc.com
21 points by _pius  2 hours ago   5 comments top 4
drinkzima 9 minutes ago 0 replies      
I'd love to see an analysis of the risk embedded in the future profits that everyone talks about. What these analyses completely ignore is that Amazon's future cash flows are risky, while the current cashflows at the dying business are not.

There are countless examples of giant companies that haven't realized the enormous future cashflows embedded in their stock prices. It is obvious that Amazon generates lots and lots of revenue, no one disagrees with that. What remains an open question is will they be able to ease off of massive infrastructure investment and actually harness the future profitability that everyone seems certain of at this point.

These straw-man arguments that profitless companies have huge market caps so they will turn hugely profitable ignore the actual question of whether they can achieve that. Can profit does not equal will profit.

7Figures2Commas 1 hour ago 1 reply      
> Amazon is not the only company that is plowing back all of its incremental profits into growing its business. This is very common for enterprise software companies as well. Salesforce has made or lost a small amount of money every year for the past four years but it has grown its revenue from $1.3bn to over $3bn in those four years. And its market value has gone from $12bn to $32bn in the same time frame. Workday hasn't made any profits in the last four years, in fact the net losses have been increasing. But the stock has doubled in the past year and the Company is now worth almost $14bn.

It's amazing to me that folks in the Silicon Valley/VC world continue to look at the performance of publicly-traded tech companies and for some reason never seem to consider that, as deserving of premium valuations as some of these companies may be, much of the crazy price action of the past several years has been driven by the Fed.

You can't seriously look at the charts of companies like Netflix and Tesla and believe that this is the result of DCF analysis.

> If you believe, as Amazon management does, that the future growth is going to be there for Amazon, then you ignore the current P&L and think about what a future P&L might look like.

> If you think that Salesforce and Workday can continue to grow their revenues at or near their current growth rates, then you ignore the current P&L and think about what a future P&L might look like.

How about this: if the Fed balance sheet continues to go up and to the right, ignore fundamentals, pick momentum stocks and think about what their future charts might look like. It doesn't take a lot of imagination: up and to the right.

Of course, when the fun ends, "profitless prosperity" will indeed be profitless for the folks who didn't cash out in time.

nonchalance 39 minutes ago 0 replies      
This all makes sense under QE and other easy money programs (which have a general tendency of inflating P/E multiples) but what happens if/when the programs stop? If 1999-2000 is any indication, those companies could easily see a 90% haircut and still be overvalued under traditional analyses (AMZN P/E is currently 1286.56, for example)
curiouslurker 1 hour ago 0 replies      
How about Twitter, which Fred is an investor in? How sure are the future profits?
Screen and asset image scaling for games coderofworlds.com
39 points by speeder  4 hours ago   14 comments top 7
jblow 1 hour ago 0 replies      
Some time ago, I wrote some articles on image scaling that go into a bit more technical depth (with a specific focus on mipmapping):



A problem I get into at the end of the second article is that gamma-correction is very important for good image scaling results. However, almost nobody gamma corrects during scaling, even today.
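The gamma problem is that sRGB pixel values aren't proportional to light, so averaging them directly darkens the result. A filter should decode to linear light first, blend, then re-encode. A minimal sketch using the standard sRGB transfer curve (values in 0..1):

```python
def srgb_to_linear(c):
    """Undo the sRGB transfer curve (c in 0..1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Re-apply the sRGB transfer curve (c in 0..1)."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def average_srgb(a, b):
    """Gamma-correct average of two sRGB values: blend in linear light, then re-encode."""
    lin = (srgb_to_linear(a) + srgb_to_linear(b)) / 2
    return linear_to_srgb(lin)
```

Averaging pure black and pure white this way gives roughly 0.74 rather than the naive 0.5, which is why gamma-naive scalers visibly darken fine detail.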

zokier 58 minutes ago 1 reply      
One alternative upscaling method (for non-integer ratios) is to first upscale with no interpolation at an integer ratio, making an image bigger than the target size. Then the large image is downscaled (with some suitable algorithm/filtering/interpolation) to the final size. AFAIK it is what Apple does with its "Retina" MacBooks, and thus I have dubbed it "retina scaling", but there is probably a proper name for the method too.

edit: comparison http://imgur.com/a/fC8iQ#1
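The two-pass method above can be sketched on a plain 2-D list of grey values: nearest-neighbour upscale by an integer factor, then box-filter back down. A toy illustration; real implementations use better downscaling filters (and, per the sibling comment, should do it in linear light):

```python
def upscale_nn(img, k):
    """Integer-ratio nearest-neighbour upscale: repeat each pixel k times in x and y."""
    big = []
    for row in img:
        wide = [px for px in row for _ in range(k)]
        big.extend(wide[:] for _ in range(k))
    return big

def downscale_box(img, k):
    """Box-filter downscale by integer factor k: average each k*k block."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h - h % k, k):
        row = []
        for x in range(0, w - w % k, k):
            block = [img[y + dy][x + dx] for dy in range(k) for dx in range(k)]
            row.append(sum(block) / (k * k))
        out.append(row)
    return out

def scale_by_ratio(img, num, den):
    """Non-integer scale num/den: NN upscale by num, then box downscale by den."""
    return downscale_box(upscale_nn(img, num), den)
```

For example, a 1.5x scale is num=3, den=2: the 2x2 image below goes to 6x6 and back down to 3x3.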

city41 52 minutes ago 1 reply      
When offering upscaling options, it's great to offer all the things mentioned in the article. But please always offer nearest neighbor as an option too. I want to see the original pixels as close to their original form as possible, and so do many gamers. Pixel art is great, the "blockiness" is part of its charm.
vinkelhake 3 hours ago 1 reply      
While not really general image scaling, there's a lot of work being done on post-processing in emulators to try to emulate CRT displays (like the aperture grille) or even the quirks of NTSC.


cageface 2 hours ago 0 replies      
You run into pretty much exactly the same set of problems in the audio DSP realm whenever you have to deal with audio of different sample rates or with variable sample rate algorithms like tape delay emulations.

Most typically in that domain you also use windowed sinc filters and there's a ton of literature on the tradeoffs of specific window designs, as well as very fast fixed point implementations etc.

It's all pretty interesting stuff and trying to make it run efficiently on modern mobile hardware is a fun challenge.
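The basic building block cageface mentions is short to write down: an ideal low-pass sinc truncated by a window (a Hann window here; real resamplers weigh the window choice carefully, and this sketch glosses over the polyphase machinery used in practice):

```python
import math

def windowed_sinc(cutoff, num_taps):
    """Low-pass FIR kernel: ideal sinc shaped by a Hann window, normalized to unity DC gain.
    cutoff is the normalized cutoff frequency (0..0.5, as a fraction of the sample rate);
    num_taps should be odd so the kernel has a true center tap."""
    m = num_taps - 1
    taps = []
    for n in range(num_taps):
        x = n - m / 2
        ideal = 2 * cutoff if x == 0 else math.sin(2 * math.pi * cutoff * x) / (math.pi * x)
        hann = 0.5 - 0.5 * math.cos(2 * math.pi * n / m)
        taps.append(ideal * hann)
    s = sum(taps)
    return [t / s for t in taps]
```

Convolving a signal with these taps low-passes it before decimation, which is the same anti-aliasing job the image-scaling filters in the article are doing in two dimensions.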

MProgrammer 2 hours ago 2 replies      
A great way to deal with scaling assets for mobile is to create your art in a vector format to begin with. For Uncle Slam on iOS we did all the artwork in Illustrator and exported to PNG at three sizes: iPhone 1x, iPhone 2x (and iPad), and iPad 2x. The results are sharp images with the right level of detail and no scaling artifacts.

We even took the exact same artwork and made an 8 foot tall banner with it, and it looks great!

steventhedev 3 hours ago 2 replies      
There's an upscaling algorithm he missed: http://research.microsoft.com/en-us/um/people/kopf/pixelart/...
Android isn't freedom, because Google is closed phonearena.com
20 points by caberus  2 hours ago   18 comments top 7
nemothekid 1 minute ago 0 replies      
What's the difference between Android/AOSP and RedHat/Linux, or nginx.com/nginx.org, or DataStax/Cassandra? I believe the Android platform itself is free; however, the many components that Google provides that run on Google's backend aren't, and understandably so.
AndrewDucker 2 hours ago 1 reply      
It may not be 100% freedom, but they allow me to load apps from wherever I like, unlike Apple or Microsoft, and so I will continue to use them.

If there was a popular phone platform that was even more open, while providing the same levels of functionality, I'd be very tempted to give it a go.

caberus 18 minutes ago 0 replies      
One thing I like about Apple, despite it being closed: old devices get software support. For example, the iPhone 4, released in 2010, gets the newest version of iOS.

I bought my Android device 2.5 years ago and it has only gotten 2.3.5; initially it was 2.3.3.

djillionsmix 12 minutes ago 0 replies      
I don't know how someone types up the 1000th iteration of this article with no acknowledgement of the 999 times people have written it before, or of any of the responses or counterarguments anyone has made.
ZeroGravitas 2 hours ago 0 replies      
"In the real world, if you want the best customer support, you're not likely to find it in the Android ecosystem. If you're looking for specific productivity apps like OmniFocus, you're not going to find it on Android. If you're looking for the best integration of Microsoft services, you won't find it on Android. So, how exactly is Android the "ultimate freedom"?"

Who needs RMS when this guy is on the case.

m458l387 49 minutes ago 2 replies      

"Replicant is a fully free Android distribution running on several devices."

torbit 14 minutes ago 0 replies      
I can change the design and layout of my Android phone. Much better than the other competitors.
CSS3 Features That You Can Finally Start Using tutorialzine.com
78 points by fvrghl  8 hours ago   49 comments top 12
bbx 3 hours ago 1 reply      
"Finally Start Using"??

If you're waiting for all browsers to implement these CSS3 features correctly, you'll end up never using them.

For example, CSS animations have been around for at least 2 years. Only Firefox and Webkit browsers supported them and you had to rely on vendor-prefixed properties, but animations were exciting enough for me to start experimenting with them as soon as I could.

There is a permanent rant against "fancy" CSS3 features that haven't reached "W3C standard" status yet. But a website experience doesn't need to be visually consistent across browsers. If you're using a plain hexadecimal color code as a replacement for rgba, it's ok. If your intro is animated in Chrome but not IE, it's ok too. If your last paragraph has a margin that the pseudo-class :last-child should have canceled, it's ok as well.

Front-End developers have waited more than a decade for CSS improvements. You can't wait for a unified browser environment to start implementing them. I'm surprised by this article's tone, as if today, suddenly, everything changed. No way: browser support is a permanent process. Don't wait for that perfect day because it will never come. Just start having fun with CSS3 while providing a decent experience for IE (the main culprit).

lazyjones 5 hours ago 9 replies      
Roughly 4% of our users still use MSIE 6 (some report that they need to due to their restricted work environment). I need a good reason to risk 4% of our revenue for some eyecandy, or a good fallback mechanism (difficult if my layout relies on calc(), for example). HTML/CSS is a sad story ...
DougWebb 2 hours ago 2 replies      
A bunch of those demos don't work on the Android browser. I also happen to know that calc() doesn't work on Safari 6.x, among others.

I've been updating my webapp software to use flexbox-based layouts, and I have to use a fair number of calc() styles to get the layout I want. (The newer CSS grid layouts would probably work better, but for now flexbox is the best available.) I've found that a combination of old and new flexbox syntax, Modernizr's flexbox detection, and JavaScript code that detects whether or not calc is available and runs a method to simulate all of my calc() styles when it isn't, is all necessary to give me good browser compatibility. IE7 and IE8 work surprisingly well, Safari 5 on Windows and 5+ on Mac work, Mobile Safari 5 on the original iPad 1 works, and Mobile Safari 6+ on newer iPads and iPhones work. Firefox, Chrome, and IE9+ work great as well, of course.

It's a lot of additional effort, but it's worth it and definitely doable if you need the browser support and you want to use modern techniques.

stephp 12 minutes ago 0 replies      
Pet peeve: Default z-index makes elements farther down in the code display on top of elements higher up in the code. With multiple backgrounds, the first listed has the highest z-index.
oneeyedpigeon 5 hours ago 1 reply      
There's a brief mention in passing, but the lack of good support for CSS columns is really frustrating. I have a requirement for columns on a project that I'm working on, but they just cannot be satisfactorily implemented in CSS alone, in any browser, short of resorting to JavaScript instead.
welder 2 hours ago 1 reply      
Speaking of cool CSS animations, try this pure-css library:

https://daneden.me/animate/ (Animate.css)

pornel 2 hours ago 3 replies      
There are so many pointless uses of calc() in this example; `margin: 0 20px` gives an identical result.

In most other cases all authors really want is `box-sizing: border-box` (which works in IE8+).

scotth 5 hours ago 0 replies      
In the article, it's mentioned that there isn't great support for flexbox, which is only half true.

Using a combination of the older and new syntax (display: box and display: flex), you can already do quite a bit.

I've been using autoprefixer (https://github.com/ai/autoprefixer) for a few months now, building simple flexboxes all over the place with the attributes from the new spec, and have only run into one or two little issues that I can generally hack away.

RoboKitten 1 hour ago 0 replies      
Modernizr is a great way to be able to start using some of these features with legacy browsers. I actually have to battle with some of my customers to upgrade to IE8 (and not to set the group policy that forces "compatibility mode").
tomasien 1 hour ago 0 replies      
Calc() is a huge deal. For non-business oriented applications where IE8 may still be in use, calc() is going to be a god damn lifesaver. I replicate the function in javascript all the time, no longer!
mmmbane 4 hours ago 1 reply      
From the article: "The standardization bodies have probably had their reasons, but it doesn't feel at all intuitive to have the CSS width and height of an element affected by its padding and borders."

And not only are height and width affected by padding+borders, they are affected in different ways, respectively. It's just weird. Does anybody have any insight into what the standardization bodies' reasons were?

jebblue 3 hours ago 1 reply      
I run Ubuntu on my PC; it's an i7 8-core machine with an nVidia GTX 660 and the latest tested restricted driver from the repository. That animation drove my aggregate CPU usage (as reported by top) to 90%. I clicked the Edit button, and while the code was showing, the CPU activity dropped back to minimal.
The E-Cigarette Industry, Waiting to Exhale nytimes.com
42 points by sschwartz  5 hours ago   97 comments top 11
pvnick 3 hours ago 4 replies      
E-Cigarettes are going to save millions of lives in the coming years. They're harmless, and, if you get a brand that feels like the real thing, easy to transition to. I hope the tobacco industry takes a huge financial hit from the sale of these things because they literally prevent people from dying.

That being said, when I switched to e-cigarettes, it was very easy to switch back to regular cigarettes. That "narrow bridge of familiarity" was easy to cross back over, especially when I ran out of vapor cartridges. Ultimately for me to quit it took stopping cigarettes cold-turkey. Nicotine gum and a transition to regular gum helped a lot. It's been one of the hardest but most rewarding things I've ever done in my life.

rickdale 4 hours ago 1 reply      
I used to run a custom e-cigarette company. We sold it this summer.

My takeaway was that the federal government will eventually get involved and start regulating the shipping of these items. Unfortunately, the stuff people are getting now is mostly made in China, and you can't be too certain of what the heck you are getting when you take a big inhale and just taste the apple flavoring or whatever. Who knows what chemicals are used to make them.

Most of the testimonials I received were from people who were able to quit smoking because of my product, but I definitely got some complaints as well. The biggest benefit of the e-cig is that there is little to no smell. I've definitely hit them on airplanes before with no problems. It is an interesting market, though, where it seems like new types of e-cigs are coming out all the time.

Another thing about e-cigs is that people are making them into e-joints. This is very popular, especially in medical marijuana states. Before synthetic marijuana was banned, we were developing a synthetic marijuana e-cig. It would have been fun to have around, but ultimately I am glad not to be filling them en masse any more!

bcx 1 hour ago 1 reply      
It is really interesting to look at this article through the lens of the PR agency/strategy that helped place it for NJOY.

1) Quotes from CEO

2) Little anecdote from Chief Marketing Officer

3) Big vision: "Vaping" becomes commonplace.

4) Key differentiators: "building e-cigarettes that look, feel and perform like the real thing" (this is sprinkled throughout)

5) Celebrity endorsement, big names: Peter Thiel, Bruno Mars

The rest (past the first page) is a nice overview of the e-cigarette industry and some challenges facing NJOY. I wonder how long they were working on an NYT article, or if this was something pretty easy for them to get, given the interest around e-cigarettes.

ams6110 1 hour ago 0 replies      
Smokers getting annoyed looks at a Stones concert??? My how times have changed. The last time I saw the Stones in concert the entire venue was so filled with cigarette (and I'm sure other) smoke that when the house lights came up at the end of the show the place looked like a Cheech and Chong scene.
tluyben2 4 hours ago 5 replies      
A lot of smokers I know switched to these things. They 'smoke' (is that the proper word with such an e-cig?) far more than they did before. Some of them use them every second they have nothing to do with one of their hands, which basically means they're puffing most of their waking day. I hope for them these things don't turn out to be as bad as/worse than normal cigs...
cliveowen 4 hours ago 5 replies      
I can't fathom who thought these things were a good idea. I see people "smoking" these things everywhere: on the bus, on the train, at the effing university during classes!

I don't care if it's just steam, I don't want it in my face. I can't believe people don't get this.

danpalmer 3 hours ago 1 reply      
If I smoked, it certainly wouldn't be something I would show off to others readily or be proud of in any way, I'd see it as an embarrassment. Even if these are safer, I still consider them to be anti-social.

I'm not a smoker, but I can't see why e-cigarettes should be anything more than a way to help stop smoking.

qwerta 4 hours ago 3 replies      
I do not understand why some people try to regulate this. My guess is that income from smoking-tax is shrinking.

I have never smoked a cigarette in my life, but I tried e-cigs as a replacement for coffee. It works: no hassle with tea or coffee preparation, and it is probably healthier. But being labeled a smoker puts me off.

hershel 2 hours ago 1 reply      
Since people here talk about the difference between e-cigs and cigarettes, i.e. the lack of effects on MAOI, has anybody tried any juice that claims to have that effect on MAOI, for example "aroma ejuice"?

How were the results for you? And does anybody know about research on, or regulation of, these types of ejuices?

And since those same MAOI-affecting compounds (probably beta-carbolines) found in tobacco are also found in brewed coffee, kinds of seasoning, grilled foods, and other things, has anybody noticed that a combination of vaping and some food is more effective?

300bps 4 hours ago 5 replies      
I am starting to see people puffing on these stupid e-cigarettes in restaurants that ban cigarette smoking. When you say something to them they invariably try to "educate" you on what an e-cigarette is and how it's different. The problem is that e-cigarettes have not been shown to be safe. From the article:

Most public health officials seem to agree that the levels of toxins in e-cigarettes are far lower than those in traditional cigarettes. But they also say that far too little is known, not just about potentially harmful aspects of particular brands of e-cigarettes, but also about whether there is harm from secondhand vapor. Dr. Glantz of U.C.S.F. says that in the absence of data, indoor smoking bans should also cover e-cigarettes.

The FDA is collecting reports of adverse effects and there are plenty:


I understand why my mother started smoking when she was 16 and then smoked a pack a day for the next 43 years until she died from cancer. Why in the world are people starting to smoke today with everything we know about it?

girvo 4 hours ago 3 replies      
I used a nicotine spray. I have done for a few months now, it works sublingually. I used it (and patches) to quit smoking, and I'm nearly there, only a few weeks left.

What I find interesting though, is that while the spray does indeed get rid of the cravings, it (and the e-cigs I've used) is not the same "feeling" as smoking. This is possibly because of the lack of any MAOIs in the liquid itself.

How many of those using e-cigs here are ex-smokers? How many picked it up because it's a socially "acceptable" drug that you can now take without killing yourself slowly? I find it such a fascinating topic!

The boss, not the workload, causes workplace depression sciencenordic.com
66 points by Libertatea  8 hours ago   18 comments top 6
mjn 4 hours ago 0 replies      
This is of more than theoretical interest in Denmark (where the study was done), because both public and large private employers have a mandatory health-and-safety oversight process, which needs to be guided by some solid facts about what actually contributes to health and safety. The typical setup is that health/safety statistics from various sources (like the health-care system) are collected and cross-referenced with employment, and if a workplace or department is an outlier on any of those (e.g. significantly above-baseline levels of new mental-health visits), the information is presented to a standing committee made up of both management and employee representatives, which is tasked with investigating why this is the case, and coming up with a plan to address the issue.
deSouza 2 hours ago 0 replies      
That workload should have no effect on risk of depression sounds downright whack. In none of the summaries of the three linked articles do I see that part of the claim documented by their data.

Below are the "results" sections from the summaries of the two non-saliva articles among the 3 referenced in the posted "article".

From "A two-year follow-up study of risk of depression according to work-unit measures of psychological demands and decision latitude."


RESULTS: The OR for depression according to psychological demands was 1.07 [95% confidence interval (95% CI) 0.42-2.49] for every unit of change on a 5-point scale. The corresponding OR for decision latitude was 1.85 (95% CI 0.55-6.26). No interactive effects of psychological demands and decision latitude were observed.

CONCLUSION: These findings suggest that low decision latitude may predict depression, but confidence intervals are wide and findings are also compatible with no increased risk.

From "Work-unit measures of organisational justice and risk of depression--a 2-year cohort study."


RESULTS: Working in a work unit with low procedural justice (adjusted ORs of 2.50, 95% CI 1.06 to 5.88) and low relational justice (3.14, 95% CI 1.37 to 7.19) predicted onset of depression.

CONCLUSIONS: Our results indicate that a work environment characterised by low levels of justice is a risk factor for depression.

tareqak 5 hours ago 6 replies      
From the article:

"Surprisingly, the study indicates that a heavy workload has no effect on whether or not employees become depressed. Instead, it is the work environment and the feeling of being treated unfairly by the management that has the greatest effect on an employee's mood."

I guess there are many ways to improve the work environment (and many ways not to), but how do you improve fairness? Isn't it too late the moment you recognize unfairness?

kps 1 hour ago 1 reply      
Our results actually show that high cortisol levels are associated with a low risk of developing depression.

High cortisol is however associated with schizophrenia, for both present and fetal conditions.

ObDisclaimer: Dammit, Jim, I'm a programmer, not a doctor.

snoonan 4 hours ago 1 reply      
It's perhaps not safe to generalize the results from a study on Danish public workers. Billions work in deeply unfair, disrespectful management structures. Many are very harsh and dehumanizing by western standards. I would be interested to see a broader study to see how culture impacts results.
NAFV_P 7 hours ago 1 reply      
Even though my last boss was an alcoholic, he wasn't the cause of the stress I had to endure, he and his brother (who worked as the finance manager) merely exacerbated it. They must have been smoking crack not to notice I had to put up with having "Ah fuck off" shouted at me three or four times a day, along with threats of being punched.
The real problems are with the back end of the software marginalrevolution.com
91 points by wwilson  4 hours ago   103 comments top 26
waterside81 3 hours ago 4 replies      
Hyperbole aside ("... an act which would border on criminal negligence if it was done in the private sector and someone was harmed ..." - what does that even mean? So all of us who have shipped buggy software for our customers are borderline criminals?) - this doesn't surprise me in having dealt with the VA. They have legacy upon legacy upon legacy, with all sorts of fun limitations like not being able to have a "\t" in your content because that'll screw up their backend which relies on tab-delimited data. Health care in the US is playing catch up technology wise to almost every industry. And not for lack of technology, but for lack of political will power.

My favourite example of this was trying to deploy an app within the VA that was written in Django. I was told "Python is not on the list of acceptable languages." So we came back to them and said, "Good news everyone, we ported it to Java." Of course, it was just Jython, but that's the sort of stuff you encounter.

Multiply this by the complexity involved in trying to herd all these cats into one backend like healthcare.gov and it was doomed to fail.

jroseattle 3 hours ago 8 replies      
"The front end technology is not the problem here."

Let me fix that statement: "The front end technology is not the worst problem here."

Looking at the resources loaded for the sign-in page, I counted 58 separate JavaScript files, including one whose name implied it was minified but which on inspection clearly was not. I didn't bother counting CSS or image resources. I returned to the page two days ago, and it indicated it was down for scheduled maintenance. It remains in this state.

CGI obviously borked this project. The government deserves its own special classification of criticism, but poor planning, change management, etc. from the government is no excuse for CGI not building an architecturally sound web site.

The contract was $350 million? Good grief, they overpaid. Nonetheless, if we could go back in time AND assuming we needed to spend this budget, here's what I would have done:

1. We make investments of $15 million in 20 different startups, and tell them to implement the initial phase -- let's say we call it the "minimum viable product" or MVP. Each startup has the same deadline for delivery.

2. On the delivery date, all companies meet with us to review their MVP. We call it a "demo day" and view all 20 demos.

3. Through some set of criteria, we create a short list of five companies from the 20 demos. Those five companies receive an additional $5 million investment, and another delivery deadline.

4. The companies iterate on their MVP and come back for another demo, this time with a deep dive.

5. We pick a winner from those five. The winner gets another $25 million investment and is responsible for any additional work to be completed.

TechStars for government, essentially.

chernevik 3 hours ago 1 reply      
Yes, and the problems are probably still worse than this.

Because integration means integrating _requirements_, leading to determination and priority of requirements. The current organizational structure doesn't seem to have anyone responsible for even coordinating that. But even if there were, they would need terrific knowledge of each agency's internal systems and legal requirements to determine what is and isn't necessary. And enormous authority, meaning both credibility and power to dictate, to get their determinations to stick.

Absent someone looking over the process, each agency will just "require" everything they might need or want. Leaving something out is risky, unless you know a lot about what you are doing and what will happen next and trust your management. Even if they had all those latter characteristics, bureaucracies don't do risk.

We all know how complexity grows exponentially. I bet the requirements document for this thing doesn't exist, and if it did it would be a clusterfuck of epic proportions.

Here is my wild theory: The possibility this could succeed died the day Tom Daschle withdrew his nomination for Secretary of HHS. Not that Daschle himself is special, though he is pretty bright. But he was slated for an unusual joint role, running HHS and a White House appointment running the health care effort. A position like that might have had access to the specialized knowledge to know what needed doing and the Presidential delegation of power to get it done. If IRS says "we must have X" and Daschle KNOWS they don't because a real expert knows they don't, he can get them in line or they can explain the problem to the President's chief of staff.

Here is the wild part. Daschle was canned, inexplicably, over a truly stupid tax issue (didn't declare a car service as income), while others had far more serious issues waived (Geithner lied about CASH income despite instruction to declare it). Why? I speculate, precisely because the role he designed for himself was remarkably powerful, and effectively outside any review because of the complexity and specialization of its task. Wouldn't the President want someone with the power and knowledge to implement his most important policy? Yes, but not someone beyond his control. Politicians are about power. JFK didn't use the legislative skill of Johnson because he feared Johnson would serve Johnson's interest, not Kennedy's. Once Obama and his people realized that Daschle could become effective President, and Obama something of a titular head of state, they shivved him.

It's all speculation. But it is all plausible enough to suggest why government doesn't work. Massively complicated projects like Google work because its people are, by and large, working for a common purpose on tasks that are commonly understood under common accountability. Government and bureaucracy are fundamentally divided in purpose and understanding. The components can be united by power and knowledge, but by its very nature the system resists establishment of such power and knowledge.

JulianMorrison 1 hour ago 1 reply      
The answer is that you can't structure the transaction as a realtime query. You have to structure it as something that's sent and gives you a ticket, and the reply associated with that ticket will come back in its own time.

Stick the processing pipeline in Twitter Storm (which can retry any step until the whole pipeline is done) and structure the requests as nearly-idempotent (so a repeated reply is harmless, and the first arrival associated with the ticket wins). Finally, you have an "inbox" where people can wait for and see their answer, with optional SMS and email notification.
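A minimal in-memory sketch of the ticket pattern described above. The plain objects stand in for the durable Storm-backed queue, and names like submit/check are illustrative; the point is that submission returns immediately and the first reply for a ticket wins, so retried steps are harmless.

```javascript
// Ticket-based async processing: submit() returns a ticket at once;
// the slow backend work completes later, and the client polls check().
var pending = {};   // ticket -> true while the backend call is in flight
var results = {};   // ticket -> reply; the first arrival wins
var nextId = 1;

function submit(request, slowBackendCall) {
  var ticket = 't' + nextId++;
  pending[ticket] = true;
  slowBackendCall(request, function (reply) {
    // Idempotent completion: repeated replies for a ticket are ignored.
    if (results[ticket] === undefined) {
      results[ticket] = reply;
      delete pending[ticket];
    }
  });
  return ticket;
}

function check(ticket) {
  if (ticket in results) return { done: true, reply: results[ticket] };
  if (ticket in pending) return { done: false };
  return null; // unknown ticket
}
```

The "inbox" page would simply call check() (or be notified by SMS/email) until the ticket resolves.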

ams6110 3 hours ago 2 replies      
Slow, legacy backend systems are not an intractable problem. You can do things such as copy the data to a faster cache, or you use some kind of queuing system so that queries are processed only as fast as the backend can handle (of course the frontend needs to be able to "check back later" for the results).

This does support the widely held belief that this system will not be fixed anytime soon. Clearly the management of the project and the design of the architecture are/were fundamentally flawed, and it's very unlikely that it can be fixed in 30 days or whatever at this point.

greenyoda 2 hours ago 2 replies      
"There are no easy fixes for the fact that a 30 year old mainframe can not handle thousands of simultaneous queries. And upgrading all the back-end systems is a bigger job than the web site itself. Some of those systems are still there because attempts to upgrade them failed in the past. Too much legacy software, too many other co-reliant systems, etc."

30 year old (1983) mainframes and databases were designed to handle large transaction loads. For example, airline reservation systems and banking systems were built on them.

And upgrading a mainframe (at least an IBM mainframe) to a faster mainframe isn't such a daunting task, since all the code from 30 years ago (or even from the 1960s) is still object-code compatible with the new machines - you can make it run even if you've lost your source code. There's still lots of 30 year old (and older) Cobol code running on mainframes today.

I agree that re-writing the 30 year old software would be hard, but simply getting it to run faster could probably be done just by spending money on the latest mainframes and disk drives. But if nobody ever did a load test on the site, they wouldn't have known that they had to do this. They probably just thought: "Oh, we have to write a web site that talks to a bunch of databases, how hard could that be?" (By the way, they could have written test code to do a load test on those legacy systems without even having a web site running. In retrospect, that's the first thing they should have done, and it would have shown them that their critical path wasn't the user interface.)

bhauer 2 hours ago 1 reply      
I have not followed the development of this news closely, but skimming these updates has been amusing. I do have a couple very basic questions. If these are stupid, I apologize ahead of time.

My understanding from previous coverage is that some of the state exchange sites, such as California's, are performing acceptably. If that is true, do those state sites also connect to and query the same legacy systems as the federal site? If so, why doesn't the federal government simply ask for or take that code? Surely it's been made available to them? If not, are the legal requirements for the states' exchanges somehow different than the federal site? That seems unlikely since my understanding is the federal site is simply standing in for states that elected to not create exchange sites. I don't see why it would be subject to extra requirements.

What am I missing here?

digikata 1 hour ago 0 replies      
"Failure isn't rare for government IT projects; it's the norm. Over 90% of them fail to deliver on time." Is this really much different from the success rate of startup culture, where VCs count themselves successful if 10% of their investments yield a return? The startup environment has the "success rate advantage" that if the venture really isn't getting traction, you can walk away from it, or change direction and do something related but not your original objective.

Government projects like the healthcare exchange don't have that degree of freedom - if they go down the wrong track, the only choice is put in more resources until it's back on track. Giving up or changing objectives isn't a decision under the control of the project - it's a legislative or budgetary question.

narrator 29 minutes ago 1 reply      
Only supports 200 simultaneous transactions, ay? Well, just put a big ol' queue on the front of it with a hard thread limit and tell people they'll get an email when it's ready.
dreamdu5t 9 minutes ago 0 replies      
The problem is government contracting. CGI will continue to get contracts.

The problem is a system where if you don't deliver you get paid millions of dollars and still get jobs.

fauigerzigerk 1 hour ago 0 replies      
Focusing on the performance or scalability of these ancient backend systems is beside the point. It's simply not a great idea to connect a significant number of backend systems run by different organizations in one synchronous online transaction. The overall probability of failure may simply be too high, irrespective of any scalability issues.
snowwrestler 1 hour ago 1 reply      
I think the easiest fix at this point is to simply design around the known delay in synchronizing all the 3rd party data calls.

Have people enter their info, then show them a screen that says "your quote will be emailed to you in 24 hours." Then the integration system has 24 hours to retry any failed data pulls, match up all the data, and generate a quote.

critium 3 hours ago 2 replies      
I've worked in and out of the public sector for the last 10 years and unfortunately, this is actually _PAR FOR THE COURSE_.

This is not the contractors' fault. It's the government's. Before I left to work with a startup, I was appalled by the lack of ownership on the client's side. Everybody is looking to shuffle responsibility, keep the lowest profile, and do the least amount of work.

It doesn't matter who's writing the code: unless they find somebody competent and passionate on the government side, large projects are destined to fail and are better left to be written by the private sector. This is government waste at its best.

I'm neither Republican nor Democrat, but just to add: if my rinky-dink app for the Dept. of Commerce gets shown to the president while it's in 'ALPHA' state, there is no way the most informed person in the world didn't know that the site was going to fail from the get-go.

taternuts 3 hours ago 0 replies      
> Amazingly, none of this was tested until a week or two before the rollout, and the tests failed.

This is absolutely incredible... two weeks?! Dealing with these legacy systems should have been the absolute first thing tested; is it not the most likely point of failure/bottleneck? Someone on the team had to have been screaming about this and been ignored, all the while shitting their pants waiting for go-live for the whole thing to crumble.

patja 3 hours ago 1 reply      
I'm wondering why states were allowed to build their own systems and opt out of the federal site. From the Washington state site we get passwords emailed in clear text, a failure to even allow people to enter all components of their income (resulting in inflated tax credit decisions), using monthly income figures where annual ones should be used (again, more incorrectly inflated tax credits). In Oregon they say they can't even log in or get through the application. Each of these state-specific sites cost tens of millions, each resulting in their own unique set of defects on launch, to implement a federal program.

The press seems very focused on the obvious availability and performance problems as well as the errors that come up within the sites that prevent someone from completing their application. There are a whole slew of second-order defects that make it appear your application was successful and correct but were based on incorrect calculations, incomplete data, or other bugs that are not obvious to the user at the time they complete the process.

snorkel 3 hours ago 1 reply      
I don't know how much of this is true, but I bet the truth is no less hilarious. It wouldn't surprise me if this system has no concept of usability and offline processing queues. No matter how complex it is to process an application, it's common sense to just give the user immediate feedback: "Thank you for your order. We'll contact you by email within N days to follow up and report your application status." Do these people expect Amazon to process orders in realtime and fling physical goods at their door in minutes? Should buying health coverage be zero-conf, one-click, instantaneous?
ape4 3 hours ago 1 reply      
The frontend assumed the backend was fast enough. That's the problem. If the frontend had been made to handle really slow responses from the backend, it would look different. It would not make people wait while transactions occurred. Or it might have a page that displayed your progress: in order to do this for you we need to contact 10 databases - here is the progress of each:

    Database One:   [=======----------]
    Database Two:   [============-----]
    Database Three: [==---------------]
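A sketch of how such a page could render per-database bars, assuming a hypothetical status endpoint that reports each backend's completion as a fraction between 0 and 1:

```javascript
// Render an ASCII-style progress bar per backend from a map of
// completion fractions (0..1), one line per database.
function renderProgress(statuses, width) {
  width = width || 17;
  return Object.keys(statuses).map(function (name) {
    var filled = Math.round(statuses[name] * width);
    var bar = new Array(filled + 1).join('=') +
              new Array(width - filled + 1).join('-');
    return name + ': [' + bar + ']';
  }).join('\n');
}

// Example:
//   renderProgress({ 'Database One': 0.4, 'Database Two': 0.7 });
```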

malandrew 2 hours ago 2 replies      
Strangely, based on the title I thought this was going to be about future startup trends. i.e. For the last 6 years or so we've seen a revolution in interface design as a competitive advantage when creating a new startup, but as the low hanging fruit opportunities are used up, as lot of the really meaty opportunities are going to be in software where there is a significant backend component performing a lot of heavy lifting and magic.

I'm not in the least bit surprised to see that a lot of the work and resulting problems with healthcare.gov are on the backend.

I just wish the government realized that we have all these amazing developers over in the Bay Area who can do a better job than the majority of those developers currently writing software for government contracts. I'm shocked no one in government has said to themselves, "What do we have to do to make our software problems accessible to the types of engineers working at the Googles and Dropboxes of the world?"

seivan 3 hours ago 4 replies      
I don't believe in this anymore

"Everyone outsources large portions of their IT, and they should. It's called specialization and division of labor. If FedEx's core competence is not in IT, they should outsource their IT to people who know what they are doing."

These days I believe each department of government that needs an iPhone application would do better to hire an iOS developer full time to maintain and polish the fuck out of it, continually.

pessimizer 1 hour ago 0 replies      
The real problem is with the weird indirection of the US government providing payment to doctors/hospitals/pharma through subsidies, tax breaks, and tax penalties granted to or extracted from citizens that can only be spent at private insurance companies or mitigated by spending at private insurance companies who then, in turn, pay for your healthcare.

The sheer complexity of this rent-seeking indirection makes keeping track of the millions of distinct participant-instances that can play out in hundreds of different ways, involving integrating tens of massive legacy systems with new, flexible business logic (for a law in flux), impractical.

With single-payer, they could have scrapped the vast majority of this complexity.

colomon 3 hours ago 1 reply      
Can anyone verify this info? I was a bit surprised to see it wasn't better sourced than the comments of a previous Marginal Revolution post. (Nothing against MR, but this seems like huge news if true.)
vpeters25 3 hours ago 1 reply      
"... for some inexplicable reason the administration decided to make the Center for Medicare and Medicaid services the integration lead for a massive IT project despite the fact that CMS has no experience managing large IT projects"

Time Magazine's "Bitter Pill" article stated Medicare had an IT system that made them more efficient than private health insurance providers. Isn't such a system large enough?

robomartin 2 hours ago 1 reply      
...and the website is not the worst part of the ACA ride we are now on.

Part of me has been ignoring a lot of the chatter around the ACA as potential right wing fabricated drama. Too much noise and bilateral bullshit being thrown about these days.

That was until a few days ago, when I learned our insurance has both more than doubled in cost and is also scheduled for cancellation. Doubled and cancelled. All as a direct result of the ACA. Brilliant! To say this was shocking is an understatement. Our annual cost will go well past $15K.

There's a tragedy of unintended consequences, side effects and direct effects, being played out in the background that hasn't completely come to the surface yet. We certainly can't be the last family to get news of this kind. That means in the coming months it is likely hundreds of thousands, if not millions, of additional individuals and families are going to receive these dreaded letters. Apparently hundreds of thousands already have. Last week was our turn.

At one point this and other issues will be difficult to ignore. And they will dwarf the IT issues. The website, as much of a disaster as it is, is likely to pale in comparison to all of the other, non IT, issues.

Some of what's happening is related to the incredible disconnect between Washington and technology. All you need to do is listen to some of these folks talk about the website issue to see how little they understand. I heard one senator say something akin to "they just have to re-enter a list of five million codes". In other words, the term "code" to some of these guys means "numbers" and that someone made a data entry error in copying "codes" into the website.

BSS (Balaji Srinivasan) covered some of this in his excellent Startup School talk:


A talk which, he comments, has been mutated into something far different from what he said by the modern equivalent of the "broken telephone" game.


I agree very much with his suggestion that an "exit" is required. Not meaning that we ought to pull up roots and go, but rather that the tech community ought to almost ignore the dinosaurs and go ahead and evolve a society more aligned with modern realities. In his talk he gives examples of various US cities that have been "exited" to some extent through technologies developed in the free market.

To some extent, it's an Innovator's Dilemma kind of a problem.


The only way to make step changes is to do it well outside of the organization looking after the status quo, because that's all they know and that's all they can focus on.

whistlerbrk 3 hours ago 0 replies      
"Or IBM, which has become little more than an IT service provider to other companies?" Come now, that's absurd. Incredible things happen at IBM research.
microcolonel 3 hours ago 0 replies      
Since when have they even been good enough at this to critique?

They're going to continue to suck royally, as royalty does.

devx 3 hours ago 0 replies      
I just hope that when this whole mess of a project goes online, and is hacked to death, NSA & friends won't use it as an opportunity to tell us "see, this is why you need to give us bigger funds and let us spy on everyone - to protect you against those hackers (offensively)!" - even though the whole issue would be the bad programming and security of the system.
'Born-to-die': this device will self-destruct in 60 seconds theguardian.com
26 points by bgtyhn  5 hours ago   5 comments top 4
tokenadult 4 hours ago 0 replies      
The article draws a good distinction between human medical device applications (very heavily regulated, and thus very slow to market) and civil engineering applications (possibly much faster to market): "Like Dickey, Odom feels that worries over the complexity of these born-to-die devices are 'a straw man' as 'the real sweet spot' for the technology are low-powered devices that are built to perform a specific function. However, she does warn that 'anything in the biomedical arena takes time in the order of 10 years for a device. Other applications like sensors to detect temperatures of structures like bridges might take less time.'"

Yes, even the user interface code for an external testing device used by medical doctors has to have a complete code review by the FDA (I know of an example). So products that look rather simple and inherently safe to laymen can take years to get to market by the time all regulatory approvals are obtained. But the article kindly submitted here immediately caught my eye with examples of medical device applications of biodegradable electronic circuits. There should be a lot of private industry uptake of further research and development of this technology, which someday may be part of routine medical practice as you and I visit physicians.

ams6110 3 hours ago 1 reply      
Would such devices really work as post-op infection treatments? Presumably the bacteria might evolve to be more heat-resistant. And there's only so much heat you can apply before you start to kill off healthy tissue.
dawernik 4 hours ago 0 replies      
The applications here could be very interesting, as you could accomplish something and simply perish. Snapchat for the physical world.
sambeau 4 hours ago 0 replies      
For the lazy commenter: <insert snark about Apple here />
Amazon and the "profitless business model" fallacy eugenewei.com
182 points by steveb  15 hours ago   111 comments top 25
hristov 13 hours ago 4 replies      
There are some issues with this explanation. The main issue is that the rules of accounting have a very good provision to take into account investing into the future. It is called capitalization.

Thus, if a company spends money to build or acquire a new asset, it is called capital spending and it is not subtracted from the profits. Thus, for example, if a company had a million dollars of profit and decided to spend these million dollars on a new fulfillment center, they could spend the money for their fulfillment center and still report a million dollars in profit.

So it is not quite clear-cut to say that Amazon's desire to build fulfillment centers around the world is costing them their profits. Those things should be capitalized and once they are capitalized they should not affect the profits. Amazon did in fact report significant capital spending (as one can see on their cash flow statement).

However, things are not that simple. Sometimes some expenses which are about building for the future and investing into new growth are not capitalized. This is the case because for some expenses the benefits are so uncertain and difficult to quantify that the SEC requires that they are reported as ordinary expenses instead of capital spending. These types of expenses tend to involve R&D and may include certain administrative expenses associated with growth initiatives.

Therefore, many companies that are trying to grow do report lower profits because they have those expenses that are associated with investment into future growth but are not capitalized. This may be the case for Amazon. But it is an open question to what extent it is the case for Amazon. For example, they do capitalize software and website development for new products and websites. So one cannot simply say that they are showing losses because they are spending all the money on making great new products. But then again, they expense software development for existing products. So perhaps the losses are associated with new growth features that are built into existing software.

So all in all it is a big muddle, and it is not at all clear whether Amazon is an inherently highly profitable company that happens to be investing in the future, or is wasting money, or has a business model that is just not that profitable.
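To make the capitalization distinction concrete, here is a minimal back-of-the-envelope sketch. All figures are hypothetical (the $1M outlay and 10-year useful life are made-up assumptions, not Amazon's numbers), and it simplifies to straight-line depreciation:

```python
# Hypothetical illustration: the same $1M outlay, expensed vs. capitalized.
# Assumes straight-line depreciation over a 10-year useful life.

operating_profit = 1_000_000  # profit before the outlay
outlay = 1_000_000            # cost of the new fulfillment center
useful_life_years = 10

# Expensed immediately: the full outlay hits this year's income statement.
profit_if_expensed = operating_profit - outlay

# Capitalized: only this year's depreciation hits the income statement;
# the rest of the cost is spread over the asset's remaining useful life.
annual_depreciation = outlay / useful_life_years
profit_if_capitalized = operating_profit - annual_depreciation

print(profit_if_expensed)     # 0
print(profit_if_capitalized)  # 900000.0
```

So an expensed investment wipes out this year's reported profit entirely, while the same outlay capitalized only trims it by the year's depreciation, which is the crux of the "is Amazon hiding profits in growth spending?" question.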

aresant 15 hours ago 2 replies      
" I sell a used book on Amazon, it takes a cut of the transaction, I am the one packing and shipping that item to the buyer. "

In true Amazon "dominate all retail by making it accessible to consumers" fashion, their relatively new "Fulfillment By Amazon" service drastically simplifies consumer reselling by eliminating the need for the consumer to do the "packing and shipping".

It's an amazing service, and they are getting darn close to the "just ship us a box of your stuff" stage.

I bet that we see that inside of the next five years. There are lots of problems (like what is / is not valuable), but you can see them already working around these issues by only accepting items with modern barcodes, charging small warehousing fees if something sits too long in inventory, etc.


minouye 11 hours ago 3 replies      
If you're not familiar with the "long-term" thinking of Bezos, this anecdote from Brad Stone's recent book on Amazon is particularly interesting:

Bezos wanted AWS to be a utility with discount rates, even if that meant losing money in the short term. Willem van Biljon, who worked with Chris Pinkham on EC2 and stayed for a few months after Pinkham quit in 2006, proposed pricing EC2 instances at fifteen cents an hour, a rate that he believed would allow the company to break even on the service. In an S Team meeting before EC2 launched, Bezos unilaterally revised that to ten cents. "You realize you could lose money on that for a long time," van Biljon told him. "Great," Bezos said.

Bezos believed his company had a natural advantage in its cost structure and ability to survive in the thin atmosphere of low-margin businesses. Companies like IBM, Microsoft, and Google, he suspected, would hesitate to get into such markets because it would depress their overall profit margins. Bill Miller, the chief investment officer at Legg Mason Capital Management and a major Amazon shareholder, asked Bezos at the time about the profitability prospects for AWS. Bezos predicted they would be good over the long term but said that he didn't want to repeat Steve Jobs's mistake of pricing the iPhone in a way that was so fantastically profitable that the smartphone market became a magnet for competition.

saosebastiao 14 hours ago 1 reply      
Out of all the articles I have read on the issue, this one most accurately sums up the views and opinions of the upper half of the organization. Nobody is scared. Nobody is feeling defensive. Nobody thinks the business as a whole is on the wrong path (although there are definitely a few ventures that some feel are in the wrong).

I don't have the most broad corporate employment history, but as far as it extends, I've met tons of people who feel like they could join a competitor to their own employer and win against them within a decade or so. I have never met a single person who worked at Amazon that has felt that way about competing against Amazon. Even if that competitor had the pocketbooks of Wal-Mart. To me, that speaks volumes about a business strategy.

codex 1 hour ago 0 replies      
Bezos has found and hacked a feature of public markets: you can get away with no profits as long as you're growing. Therefore, you can construct a profitless business scheme that reinvests all profits (or doesn't generate any) as long as your sales forever climb. It's the business equivalent of the Ponzi scheme--and if you look at Amazon's revenue, it is a classic exponential curve.

If sales ever plateau and investors force you to generate profits, the plane stalls and the whole thing spirals down, because it's the profit reinvestment which actually drives sales growth, and actual profits attract competitors who have been unable to pull off the profitless-hyper-growth trick. So far that hasn't happened.

Amazon's value is in the entire business and not the sum of its parts, which means that at some point, investors expect to own a profit making enterprise and not a bunch of warehouses. However, that won't happen until sales plateau or Bezos dies. Ironically, at that point the business loses a lot of value, both because growth has stopped and because competitors are about to enter the space, emboldened by Amazon's newly discovered profits. The whole thing is a bit of a sham. Any growth industry (Internet retail) can support only one "no profit rocket," and eventually it comes back to earth when that industry matures and ends the hypergrowth phase.

jaggederest 14 hours ago 1 reply      
I think Amazon is a great example of the kind of company that makes genuine long term fundamental change to the way the world functions. I really wish that more companies had a less quarterly mindset and would pursue things similarly.
WalterBright 14 hours ago 2 replies      
Amazon is the proof that corporations are not all short-term-focused, and the shareholders have amply rewarded Amazon for that with a huge P/E.
krakensden 13 hours ago 1 reply      
It's worth noting that Yglesias actually knows this[1]. His point is that public companies generally aren't allowed by their shareholders to be this ambitious.

Which 100% vindicates Eugene Wei's point about tech companies being wary of capital markets.

[1]: http://www.slate.com/blogs/moneybox/2013/10/22/amazon_profit...

ballard 13 hours ago 3 replies      
The author may not fully appreciate the long game Bezos has been uniquely blessed to play: the sooner Bezos can effectively expand what's working, without over-expanding, the better. It's bootstrapping on a massive scale: buying speed without diluting ownership by raising even more money sooner. It's not deficit spending (until it is), it's reinvesting profit to grow assets that are the body of the money monster. (For Starcraft fans out there: it's like being broke because you focused on building SCVs.)

On the other side of the gorge of eternal peril: cash is king, and should not be underestimated. Those with the war-chests may try to puke all over Bezos' cake by mistaking lack of current reserves for an actual weakness. I'm sure Bezos is fully aware of the ridge-line he's walking. He probably has aces up both sleeves to clobber anyone that tries to make a move.

Long term, I'd say Walmart continues to cash in on the greater unwashed that don't know any better for b&m impulse buys, while AMZN goes after suppliers and logistics, maybe even an Alibaba and/or Kickstarter to bring in more product pipes.

tks2103 8 hours ago 1 reply      
The writing style and grammar in this post interfered with my comprehension. In the end, I was unable to finish reading it.

Some examples:

"Giant, heavy electronics items that Amazon sometimes ships for free when the shipping cost is clearly non-trivial and cost more than the usual thin margins on such goods are another."

"But if you sell a glass of lemonade for $2 and it only costs you $1 to make it, and you decide business is so great you're going to build a lemonade stand on every street corner in the world so you can eventually afford to move humanity into outer space or buy a newspaper in your spare time, and that requires you to invest all your profits in buying up some lemon fields and timber to set up lemonade franchises on every street corner, that sounds like a many things to me, but it doesn't sound like a charitable organization."

"The vast vast majority of products Amazon sells it makes a profit on."

It should be relatively easy to rephrase most of the language. For example, the last sentence should be worded: "Amazon makes a profit on the vast, vast majority of products it sells."

I think it would be worth it. I can't understand a lot of the post without effort.

swalsh 8 hours ago 0 replies      
I've always thought of Amazon as this last dinosaur of a bygone era. The days where you can have a really big vision, where if you work a spreadsheet a bit here and there you suddenly have massive amounts of profit. We just need to wait for the world to finish being disrupted. If you disagree with the vision, then you "just don't understand". Most of these businesses failed, but Amazon found just enough profit sitting somewhere that they have managed to keep on living... So they are in this unique position where they are allowed to invest, and grow to unfathomable heights (well, theoretically), because of the survivorship bias of the investors.
gizbot 14 hours ago 6 replies      
Strangely, this was the business model of cable companies for the longest time. They never turned a profit. When they expanded, they could use the increased income stream to go deeper into debt. The profits and extra capital went into more expansion. Eventually, they ran out of room to expand, and where are they now?

Someday, Amazon will need to face the brutal reality of profit.

sidcool 14 hours ago 0 replies      
I believe in Jeff's long-term vision theory. He is even building a $42 million giant clock called the '10,000 Year Clock' atop Mount Washington in Nevada. This is to portray his long-term vision.

(I feel guilty mentioning it, but this reminds me somewhat of the 1,000-Year Reich)

hayksaakian 15 hours ago 2 replies      
Think about it this way: profits are dollars that leave the company. By keeping profits low, Amazon turns all its cash back towards itself.
brisance 3 hours ago 1 reply      
This is a dangerous narrative that links the founder to the company in the same way that Apple is forever linked to Steve Jobs.
timedoctor 9 hours ago 0 replies      
There is no issue with reinvesting for growth. Businesses that require a lot of capital to grow need to do that and might need to continue operating with lower or minimal profits as they grow.

However at some point it's important to be able to say that they have played out the majority of their growth ambitions and are ready to start optimizing the business for greater profit.

The trouble is that human nature for many CEOs with big egos, and the structure of corporations, is to want to continue to grow forever. This is a dangerous attitude. For example, perhaps Microsoft shareholders would have been much better off if the company was run without ANY ambitions to compete with Google or Apple, or to dominate mobile or tablets or search or any of these areas. Instead, if Microsoft were to just focus on Windows and Office and extract as much profit from the business as possible, then return these profits to shareholders, the shareholders would be free to invest in Apple and Google stock.

The trouble with this is that for an ambitious CEO this might feel like giving up. I don't believe it's giving up. It's called focus. Focusing on what you are really good at (in this case Windows and Office), rather than pretending that you are great at everything.

don_draper 7 hours ago 2 replies      
I hear working at Amazon requires being available on call and working long hours. If this is the future, I'm worried.
walshemj 2 hours ago 0 replies      
It's not uncommon for companies to reduce their profit by various stratagems to reduce the tax they pay.

For example, Apple's massive overseas cash pile that they don't want to repatriate and pay out to the owners of the company.

hownottowrite 5 hours ago 0 replies      
Everything you need to know about Jeff's strategy is in this book: http://www.amazon.com/Sam-Walton-Made-In-America/dp/05535628...

Different medium and market, but basically the same overall strategy.

ars 13 hours ago 3 replies      
Would it be so terrible if Amazon just stayed as a break-even company forever?
moca 12 hours ago 0 replies      
By running at zero profit margin, Amazon is essentially growing itself as fast as it can manage, i.e. reinvesting every dollar. Its current revenue growth is even faster than Google's. That ensures its place as the biggest ecommerce platform for years to come. If it wants more profit, it can certainly get it. I believe Amazon will eventually automate most of its systems, like using robots instead of humans in warehouses, and gain significant profit margin. The Chinese company Taobao (like eBay) provided free service for 5 years and gained dominant market share. Now it is hugely profitable.

On the other hand, Jeff is likely more interested in just growing the business than counting profit dollars.

Semaphor 6 hours ago 0 replies      
I think I might be in some kind of bubble. That article sounds like absolutely every article I've ever read about Amazon, and I don't think I've ever seen any of the posts he said "didn't get Amazon".
KaoruAoiShiho 10 hours ago 4 replies      
Amazon only has around 10 more years before 3d printing starts to kill retail. Beware.
sdepablos 10 hours ago 0 replies      
Great explanation, but I don't think this business model is incompatible with partially "flipping the switch", as Amazon already did. Examples: raising the minimum amount for free delivery from 25 to 35, or removing free delivery from Amazon UK to certain countries like Spain to avoid cannibalizing its own business in those countries.
devx 9 hours ago 0 replies      
I wish Amazon would stop subsidizing Kindle device buyers, by surcharging everyone else by $2 on ebooks - especially when that money isn't even split with the authors.


A black box in your car? Some see a source of tax revenue latimes.com
7 points by eplanit  2 hours ago   15 comments top 6
scragg 32 minutes ago 0 replies      
Moronic. So to create tax revenue, let's spend tax revenue on a device, data collection, analysis, and complex billing, which of course the government, with its great history of being a good steward of our tax dollars, will overpay for. All for something that can double as a spying device. Yay.

Just raise the gas tax to make up for lost revenue. I'm sick of hearing "we are addicted to oil" every State of the Union when the answer is right there. It solves the road revenue problem as well.

PeterisP 35 minutes ago 7 replies      
Why is raising the gas tax not an option? I mean, almost no one wants to pay more taxes, but if the road fund needs more money, then it's far simpler to raise the existing tax rate than to implement a new process, collection agency, and monitoring tools.

The rate listed in the article (18 cents/gallon) is not a big enough deterrent to driving - it's ten times less than the gas excise tax I pay in the EU, which comes out to a bit less than $2 per US gallon. Business doesn't stop because of it, and there's extra motivation to reduce polluting transportation.
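As a sanity check on that comparison, here is a quick back-of-the-envelope sketch using only the two rates quoted above (the $2/gallon EU figure is the rough number from the comment, not an official rate):

```python
# Rough comparison of the quoted US federal gas tax vs. an EU-level excise.
LITERS_PER_US_GALLON = 3.785411784  # exact definition of the US gallon

us_federal_tax_per_gallon = 0.18  # dollars, as quoted in the article
eu_excise_per_gallon = 2.00       # dollars, roughly, as quoted above

ratio = eu_excise_per_gallon / us_federal_tax_per_gallon
eu_excise_per_liter = eu_excise_per_gallon / LITERS_PER_US_GALLON

print(round(ratio, 1))                # 11.1 (roughly "ten times less")
print(round(eu_excise_per_liter, 2))  # 0.53 dollars per liter
```

So the quoted EU excise works out to a bit over eleven times the US federal rate, consistent with the "ten times" figure above.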

cschmidt 10 minutes ago 0 replies      
Oregon is doing exactly this, apparently. The Economist had an interesting article on it recently.


It does seem like raising the gas tax would be an easier option.

dlgeek 29 minutes ago 0 replies      
And we have to use complicated tracking/transmitting devices that force us to trust that you're not monitoring our movements, instead of just having our odometers read annually (perhaps during emissions testing?)... why?
djillionsmix 22 minutes ago 0 replies      
Four or five times a year someone floats this eternally stupid idea for the Exciting New Car Tax!!! and the answer to why the hell people would ever accept this, but not a gas tax increase, is always mumble mumble mumble "hey, look over there!"
kunai 51 minutes ago 0 replies      
The TRON-ripoff logo isn't helping at all.
Interesting facts about when you get hit by Hacker News tsunami matvoz.com
4 points by matvoz  1 hour ago   discuss
AWS instance was scheduled for retirement amazon.com
68 points by themonk  8 hours ago   60 comments top 16
noonespecial 5 hours ago 2 replies      
Remember kids, an EC2 instance is not a server. It's a process on someone else's server and all of your data is stored in /tmp. Do plan accordingly.
mdellabitta 5 hours ago 2 replies      
They generally send you an advance email. I just had to migrate our Jenkins server a week or two ago because of this. I received something like 15 days notice on that one.

But obviously if there's a hard failure, they aren't always going to be able to give you the amount of time you'd want. Generally speaking, you should have accounted for this situation ahead of time in your engineering plans. Amazon EC2 doesn't have anything like vmotion, it's just a bunch of KVM virts.

If you're using the GUI, the first time you try a shutdown, it will do a normal request, but then if you go back and try it again while the first request is still pending, you should see the option for doing a hard restart. Try that and give it some time. Sometimes it takes an hour or two to get through. Otherwise, Amazon's tech support can help you.

Corrado 5 hours ago 2 replies      
Ok, the key to working with AWS EC2 instances is to remember that they are ephemeral and can disappear at any point in time. If you're treating one like a traditional server that you have in a rack, you're doing it wrong. Just turn it off and start a new one. You are using a configuration manager (Puppet, Chef, etc.), aren't you?
sudhirj 3 hours ago 1 reply      
I'm working with another team of people who haven't yet tried working with cloud servers, and one of the things they're struggling with the most is that cloud servers need to be thought of as disposable. They can't easily digest the idea that servers can and will go down randomly for no known reason.

I think Amazon needs to put a lot more effort into educating people about the best practices involved here - creating immutable and disposable servers, make it easier (console access) to create availability groups, etc.

apetresc 4 hours ago 0 replies      
Not only do they send you an e-mail about this, they even have an API call for it: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/monitorin...

Anyone who's surprised that this happens has not used EC2 very much. It is this way by design.
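For reference, a minimal sketch of what acting on that API might look like. The dict below mirrors the shape of EC2's DescribeInstanceStatus response (the instance IDs and dates are made up), and the filtering is plain Python so the sketch runs without AWS credentials; in real use `response` would come from boto3's `ec2.describe_instance_status(IncludeAllInstances=True)`:

```python
# Sketch: pick out instances with scheduled events (e.g. retirement) from a
# DescribeInstanceStatus-shaped response.
response = {
    "InstanceStatuses": [
        {"InstanceId": "i-0abc123", "Events": [
            {"Code": "instance-retirement",
             "Description": "The instance is running on degraded hardware",
             "NotBefore": "2013-11-01T00:00:00Z"}]},
        {"InstanceId": "i-0def456", "Events": []},
    ]
}

def scheduled_events(resp):
    """Return (instance_id, event_code, not_before) for each pending event."""
    return [(status["InstanceId"], event["Code"], event["NotBefore"])
            for status in resp["InstanceStatuses"]
            for event in status.get("Events", [])]

for instance_id, code, when in scheduled_events(response):
    print(f"{instance_id}: {code} scheduled, not before {when}")
# i-0abc123: instance-retirement scheduled, not before 2013-11-01T00:00:00Z
```

Polling this and alerting on any non-empty `Events` list is a cheap way to never be surprised by a retirement notice, whether or not the email arrives.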

RockyMcNuts 5 hours ago 0 replies      
I've gotten one of those emails and thought, OK, it's gonna reboot; not a problem for that instance, it has no persistent data I care about.

Then it kept running, but there was no way to reboot it from EC2 console or ssh, so that was a bit of a problem, had to get support to do it.

Moral - reboot it yourself at a convenient time.

rpm4321 4 hours ago 3 replies      
This is somewhat unrelated, but what's the general consensus on the security of EC2 for very sensitive computation?

For example, I have a client who has some algorithms and data that are potentially quite valuable. EC2 and other AWS services would be a huge help with their project, but is there any way measures could be taken to ensure that no one - even Amazon employees - can get to their code and data?

Edit: devicenull makes some good points - I guess I had the CIA's $600 million AWS contract in my head when asking my question.

kartikkumar 6 hours ago 3 replies      
I think I'm missing something. Why isn't Amazon sorting this out behind the scenes so that any failing hardware is seamlessly replaced and the user is none the wiser? Am I expecting too much?
tschellenbach 6 hours ago 0 replies      
Usually they send you a nice email about this. Then you have to look up the instance and hope it's a webworker and not your main database :)
keithgabryelski 5 hours ago 0 replies      
To work in AWS's system you must have redundant nodes -- such that any single node can be rebooted without affecting the system as a whole.

Notification that your system is on old hardware that has been deprecated is part of the price of doing business in this cloud system.

As others have noted: yes, it is a little tense (is this my production database or my Continuous Integration machine?) -- the email you get just gives you an AWS ID token, so you must look it up.

But AWS has enough components to help you build resilient systems that, if you've done your job correctly, you shouldn't care about these messages other than the labor of spinning up a replacement.

gyepi 4 hours ago 0 replies      
War story: I was once called in to scale an application that had been running on AWS for 6 or 7 months and was failing due to excessive traffic. Normally a good problem to have, but this turned into a difficult problem because the application stored critical data on an EBS volume, and those are, of course, not sharable. The only solution was to move to increasingly larger instances until the application could be rewritten. Moral: if you are on the "cloud", make sure your application design fits your infrastructure.
chris_wot 6 hours ago 6 replies      
What, you don't get notified?
regularfry 6 hours ago 2 replies      
Interesting that they've gone that way rather than attempt any sort of live migration.
tbarbugli 3 hours ago 0 replies      
What's the point of this entry? Are we surprised that hardware fails? I am the complete opposite of an EC2 fanboy, but every time they decided to shut down a machine they had the good taste to send an email to us.
dabs_return 3 hours ago 0 replies      
Luke@AWS updated your thread. Makes a lot more sense now as a notice would only be sent if it was a scheduled eviction.
Clojure from the ground up aphyr.com
220 points by Adrock  22 hours ago   75 comments top 11
laureny 15 hours ago 4 replies      
> I want to convince you that you can program, that you can do math, that you can design car suspensions and fire suppression systems and spacecraft control software and distributed databases,

I know you are trying to help but you need to realize that the whole section, and this part in particular, is incredibly condescending and guaranteed to piss off any female who might be reading you.

You want to help achieve gender equality in the technical field?

Pretend that the gender of your reader is of no consequence and just write your stuff, period.

nicolethenerd 21 hours ago 1 reply      
The "who is this guide for?" section seems a bit out of place here in an otherwise technical document. While I appreciate the sentiment, studies have shown that girls/women perform worse on tests if they are reminded that they're female right before they take the test - I would imagine that a similar effect might occur when learning Clojure (for any of the discriminated groups mentioned in the welcome message). If someone has managed to find this page, chances are they already know a bit about programming and don't need a welcome message - no?
ninetax 20 hours ago 2 replies      
I have really, really been getting into Clojure lately. It's a functional language that doesn't feel hard to grasp. But it can be way harder to make sane comparisons between libraries and tooling in a community as small, but as active, as Clojure's.

My kingdom for a decent comparison between NodeJS+ClojureScript vs vanilla Clojure (w/ Compojure maybe?) for high performance web applications!

leephillips 21 hours ago 1 reply      
I've been playing around with clojure and clojurescript on my Ubuntu laptop for a while now, but according to this guide I'm not ready to get started because I don't have "javac" installed.
WoodenChair 21 hours ago 3 replies      
The guide appears to assume one is on Mac OS X or Linux without explicitly stating as such (unless I missed it, in which case my bad). That's not good for a world where most people programming on a desktop are on Windows (Mac user myself).
mahyarm 15 hours ago 2 replies      
Maybe you want to try Light Table. It's a pure GUI experience, and is probably easier than setting up lein. Currently free.
sriharis 6 hours ago 0 replies      
This first lesson was quite similar to Clojure in fifteen minutes: http://adambard.com/blog/clojure-in-15-minutes/. I find both pretty good, although this seems more promising. Waiting for more!
shn 14 hours ago 0 replies      
IMHO, it would be better to separate two distinct things into two separate articles: your views on women in computing, and the intro to Clojure.
sr-ix 21 hours ago 0 replies      
I was just asking a friend of mine for some suggestions for starting up with Clojure.

I loved your Jepsen series and you communicate on a level that I can relate to. As such I was thrilled to find your guide at the top of HN.

Keep up the awesome work!

jackhammer2022 19 hours ago 0 replies      
I love the thorough Guide to Programming in Clojure for Beginners: http://blackstag.com/blog.posting?id=5
daurnimator 20 hours ago 1 reply      
Pointers to a good follow up?
Federal Prosecutors, in a Policy Shift, Cite Warrantless Wiretaps as Evidence nytimes.com
271 points by panarky  1 day ago   100 comments top 9
Rogerh91 23 hours ago 5 replies      
Huge deal, as this will most likely set the stage for a Supreme Court ruling that will define this generation's privacy rights.

The liberal justices voted as a bloc together in CLAPPER, DIRECTOR OF NATIONAL INTELLIGENCE, ET AL. v. AMNESTY INTERNATIONAL USA ET AL. to try to challenge the constitutionality of warrantless wiretaps, and I expect much the same from Kagan, Ginsburg, Breyer, and Sotomayor in this case.

Of the conservative justices, Roberts, especially given his tendency to try to hit some home-run majority rulings for his legacy of being a "by-the-rules" arbitrator, and his pronouncement of privacy issues as being the paramount constitutional issue would be most likely to flip with the liberals. With that said, his previous defense and work on behalf of Bork, and his theory of a lack of privacy in the Constitution does leave a bad taste.

Justice Kennedy unfortunately cannot be counted on when it comes to privacy issues. His majority opinion on Skinner v. Railway Labor Executives enumerating that the government could violate the privacy rights of railway workers by subjecting them to drug tests due to a "special needs" exemption where the Fourth Amendment could be ignored if it was deemed to be in the overriding interest of public safety is the basis of the NSA's metadata collection program---see: http://www.nationaljournal.com/nationalsecurity/how-justice-...

He's still the second most likely to flip because Scalia, Alito, and Thomas are basically lost causes. Scalia basically called a general right to privacy in the Constitution rubbish, and it's unlikely either of the three will bend their ideological bent that the "national security agencies" know best.

The votes might be there. It probably hinges on Roberts. But significant positive changes to how the American government deals with privacy issues could happen. Again, the votes might be there, which is better than never discussing the issue at all (or discussing them in dark, dank courtrooms nobody hears about).

Cause for hope goes up exponentially if one of the conservative justices retires and is replaced by a young liberal justice attuned to technology, much as Kagan is. If that happens, this possible scenario becomes a likely one.

Wildcard: The Supreme Court actually knows little to nothing about technology. They still pass paper briefs among each other instead of using email... a strongly written amicus brief in this situation by technology-savvy leaders could well tip the balance.


codex 1 day ago 13 replies      
This is a good out for Obama. He can't cancel the program or he may appear soft on terror, hurting Democratic presidential chances. Furthermore, the underlying surveillance law was passed by Congress (FISA Amendments Act of 2008), and he has a duty to uphold it. However, he doesn't want to continue these programs unless they are constitutional. Letting the Supreme Court review the program either shuts it down or gives it a stamp of legality. Now that derived evidence has been introduced into a criminal court, someone finally has standing to sue. This may be a part of a gradual wind down of the war on terror.
pmorici 14 hours ago 0 replies      
Seems like the other big story here might be that the Solicitor General probably knowingly committed perjury in front of the Supreme Court. It says in the article that he "discovered" this past June that defendants weren't notified and two paragraphs down it talks about how he stated in arguments in front of the Supreme Court a year earlier that defendants facing such evidence would be notified.
Karunamon 1 day ago 5 replies      
I wonder what was the genesis of this policy change? From a legal standpoint, they were untouchable with the whole "parallel construction" thing.

This has the potential to get the whole program killed. Did someone in charge with both the clout and the morality to do the right thing take a risk? Some other reason? This fascinates me.

3825 22 hours ago 1 reply      
> Mr. Muhtorov is accused of planning to travel abroad to join the militants and has pleaded not guilty.

I don't want to distract anyone from the conversation but I don't understand the case. The prosecutors are just accusing him of planning to join militants? No actual firm conspiracy/plans to actually cause any physical harm? No actual target to attack?

What crimes are the prosecutors trying to prove here?

coldcode 1 day ago 1 reply      
Either the US is a nation of laws or a banana republic. You can't have your illegal surveillance and eat a banana too.
tobylane 1 day ago 0 replies      
The last paragraph mentions past convictions based on evidence where this sort of notice should have been given, but it wasn't policy to do so at the time. Is there a group out there known to deal with this sort of thing? Do the ACLU/AI/Supreme Court even have a chance before there's some retroactive fix?

Also, based on their past decisions how would the SC rule?

crb3 14 hours ago 0 replies      
Sounds like the Stasi are sure they've bought themselves a judge.
avty 20 hours ago 0 replies      
The end of freedom
Baretorrent: minimalist open-source Bittorrent client baretorrent.org
84 points by soundsop  15 hours ago   57 comments top 20
jasonkester 9 hours ago 2 replies      
This is an interesting space to watch. Over the years, we've seen the cycle happen half a dozen times, where the dominant Bittorrent client gets too big, bloated, and spywarish so somebody introduces a new one with an intentionally silly name meant to evoke just how tiny, pico, micro (or in this case bare) it is and how it will never ever bloat out like that thing it replaced.

Then it starts adding features. Then it starts getting big. Then somebody starts offering enough ad money that maybe the idea of a tiny little banner ad isn't such a bad idea after all. Then a few years go by, somebody discovers a rootkit in the installer for the 300mb version, and gets annoyed enough to once again implement the protocol in 22kb, name it "scrunchyTorrent" and release it.

It's quite fascinating.

fsckin 10 hours ago 6 replies      
I haven't found anything better than using Transmission[0] as a daemon and Transmission Remote GUI[1] for the adding/removing/management.

I can manage the downloads from any internet capable device from basically anywhere.

Adding CouchPotato[2] and a branch of SickBeard[3] to the mix makes it brilliantly easy to download just about anything, automatically, without searching for anything other than the specific title that I'm looking for.

[0] http://www.transmissionbt.com/

[1] https://code.google.com/p/transmisson-remote-gui/

[2] https://couchpota.to/

[3] https://github.com/xbianonpi/Sick-Beard-TPB

beefsack 13 hours ago 3 replies      
Undoubtedly many of you know of it already, but Transmission (http://www.transmissionbt.com/) is an amazing, lightweight Bittorrent client which includes a web interface and things like scheduling. It's included by default in many Linux distros.
1337biz 1 hour ago 0 replies      
The problem for me with new torrent clients is that they must gain approval from private trackers. There is so much ratio cheating going on that unless a client is whitelisted, it is pointless to use it: you'll get banned by the private tracker for doing so.

Private trackers are the only reason why I am still using torrents as there are some super specialized small communities around that share otherwise incredibly difficult to get material.

kristopolous 12 hours ago 2 replies      
rtorrent is great too. http://en.wikipedia.org/wiki/RTorrent I put it in a screen session and detach and log out ...
orik 13 hours ago 1 reply      
How does it compare to Deluge? http://deluge-torrent.org/
e12e 11 hours ago 1 reply      
Always nice to have a simple alternative now that uTorrent is dead (well, as the product/app it once was anyway).

Looking briefly at the code, it appears to be c++ (which is fine) -- but also entirely without tests? Or did I miss something?

steeve 4 hours ago 0 replies      
Personally using qBittorrent (on both Mac and Windows), and I'm really satisfied so far.
mariusmg 6 hours ago 1 reply      
While most of us have multi-TB drives, it's still a bit "wrong" to see that the "minimalist" clients have 25+ MB install kits while uTorrent 2.2.1 is still rocking in at 300 kb. Just saying...
tuananh 12 hours ago 1 reply      
The UI on OSX is not pretty but decent.

The problem with less well-known clients is that private trackers may not allow them, which makes them useless for users of those trackers.

laurent123456 12 hours ago 2 replies      
It doesn't seem to work in OS X 10.8; it starts then closes immediately. Also, the choice of wxWidgets for a new app is quite unfortunate. I think Qt is generally easier to develop with and to get working across different platforms.

The idea of a minimalist cross-platform and open source Bittorrent client is great though, I really wish there was some good alternative to replace uTorrent.

Kelet 13 hours ago 0 replies      
I've been using Baretorrent for a while now. It's a great torrent client if you don't need a lot of features such as automatic RSS downloading, although technically there is a plugin system if one did want to implement that :). It's worth noting that there is a small bug regarding duplicate torrent entries in the current version that may or may not affect you[1]. It is easily avoidable and soon to be fixed. The author of the software is very responsive on his forums if you have feature requests or encounter any bugs.

[1] http://baretorrent.org/forum/thread.html?id=40

ollybee 10 hours ago 1 reply      
It uses libtorrent, which is excellent. There is a list of libtorrent-based projects here: http://www.libtorrent.org/projects.html

It's not clear from the site how this is different from any of the other GUI-based libtorrent software such as qtorrent or halite. Does anyone know if this has any unique features?

shomyo 7 hours ago 1 reply      
> minimalist open-source Bittorrent client

Total Installed Size: 40.79 MiB


chj 6 hours ago 0 replies      
You can't call a 20MB+ download minimalist, can you?
Marwy 10 hours ago 1 reply      
Does it have categories/labels? Priorities? There absolutely needs to be a features list on the homepage.
wooptoo 8 hours ago 1 reply      
This is good for Windows. On Mac/Linux we have Transmission (which is quite small, fast and not bloated) and rtorrent for the minimalists.
gcb0 6 hours ago 0 replies      
Minimalist... Yet not a UNIX program

If you look at the changelog, feature creep has already started: sorting, etc. All of that could have been piped to a specialized program.

People never learn.

voltagex_ 10 hours ago 0 replies      
Anyone know how this compares to qbittorrent?
alg0rith 10 hours ago 0 replies      
Does it have RSS feeds?
Intro to Pandas data structures gregreda.com
102 points by grej  16 hours ago   16 comments top 8
zissou 8 hours ago 1 reply      
As a long-time pandas user, I'd say this is one of the better write-ups I've seen that illustrates the versatility and functions of the Series and DataFrame objects without being too long-winded.
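For anyone who hasn't used them yet, here's a minimal sketch of the two objects (the names and values below are made up for illustration):

```python
import pandas as pd

# A Series is a labeled 1-D array; a DataFrame is a dict-like
# collection of Series that share one index.
s = pd.Series([10, 20, 30], index=['a', 'b', 'c'])
df = pd.DataFrame({'price': s, 'qty': [1, 2, 3]})

print(s['b'])               # 20
print(df.loc['c', 'price']) # 30
```

The DataFrame picks up its index from the Series it's built from, which is the alignment behavior the write-up goes into.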

Just one thing to point out regarding the final example: read_csv will actually fetch a URL if it's given as input, so there is no need to use urllib2 and StringIO. Instead, you can just do:

  from_url = pd.read_csv('http://www.example.com/data.tsv', sep='\t')

gjreda 4 hours ago 0 replies      
If anyone is interested in reading it as one long Notebook, you can use NBViewer: http://nbviewer.ipython.org/urls/raw.github.com/gjreda/gregr...
gknoy 13 hours ago 1 reply      
Thank you for a great writeup! Your examples are excellent, I can see a direct reason why using IPython Notebooks would improve my workflow, AND I am already convinced that pandas could make some of my tasks (e.g. writing Excel reports) easier.
0003 12 hours ago 0 replies      
Nice job. I really find the best way to grasp pandas is through these ipython notebook walkthroughs. I noticed that http://inundata.org/R_talks/meetup/images/splitapply.png has row 'a' as 2.5, but it should be 3.
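For reference, the split-apply-combine in that image boils down to a groupby. A minimal sketch, with hypothetical values chosen so that group 'a' averages 3:

```python
import pandas as pd

# Made-up data resembling a split-apply-combine diagram:
# group 'a' holds 2 and 4, so its mean is 3 (not 2.5).
df = pd.DataFrame({'key': ['a', 'a', 'b', 'b', 'c', 'c'],
                   'value': [2, 4, 0, 4, 5, 5]})
means = df.groupby('key')['value'].mean()
print(means['a'])  # 3.0
```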
macarthy12 14 hours ago 3 replies      
Pandas is great, nice write up.

One thing I do have an issue with in pandas is the type conversion on sparse data, i.e. a column with missing values. It's a pity that it converts that to a float, for example.
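A minimal sketch of that upcasting (column values are made up): pandas represents missing values with NaN, which is a float, so an integer column with a hole in it becomes float64.

```python
import numpy as np
import pandas as pd

s = pd.Series([1, 2, 3])
print(s.dtype.kind)          # 'i' -- an integer dtype

# Introduce a missing value: NaN is a float, so the column is upcast.
s_missing = pd.Series([1, 2, np.nan])
print(s_missing.dtype)       # float64
```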

agumonkey 4 hours ago 0 replies      
I love the pragmatism of it, clipboard and excel bridges are major wins in most offices.
RobinL 7 hours ago 2 replies      
Great write up, thanks - wish I'd had this when I got started with pandas because the thing I found tricky was converting the SQL I knew into its pandas equivalent.
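For example, here's a hypothetical SQL query and a rough pandas equivalent (the table and values are made up):

```python
import pandas as pd

df = pd.DataFrame({'city': ['London', 'London', 'Leeds'],
                   'rent': [1200, 950, 500]})

# SELECT city, AVG(rent) FROM df WHERE rent > 600 GROUP BY city;
result = df[df['rent'] > 600].groupby('city')['rent'].mean()
print(result['London'])  # 1075.0
```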

One thing I would point out for new users is the .loc and .iloc functions which I think make selecting data more intuitive because they are a bit more explicit.
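A quick made-up example of the difference:

```python
import pandas as pd

df = pd.DataFrame({'score': [90, 85, 77]}, index=['ann', 'bob', 'cal'])

# .loc selects by label, .iloc by integer position -- explicit and unambiguous.
print(df.loc['bob', 'score'])  # 85
print(df.iloc[1, 0])           # 85
```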

rebootthebox 12 hours ago 0 replies      
Excellent post, I just started using Pandas last week and this will be helpful.
LearnGitBranching now has lessons on git fetch, push, and pull pcottle.github.io
25 points by xxbondsxx  2 hours ago   9 comments top 3
officialjunk 55 minutes ago 2 replies      
i feel like my flow is always different than these types of tutorials:

  git checkout master
  git pull
  git checkout -b my-working-branch
  git add <files>
  git commit -m "some description"
  git add <files>
  git commit -m "review comments or other changes"
  git checkout master
  git checkout -b my-working-branch-squashed
  git merge --squash my-working-branch
  git commit
  git checkout master
  git pull
  git cherry-pick <hash from squashed commit>
  git push
this way i can push a single commit that may have been composed of multiple commits on my working branch. i haven't come across a tutorial that does it this way. am i doing it wrong? this flow works well for me.

joeblau 1 hour ago 1 reply      
I beat the main game and now you guys are adding more levels! Argh, I'm trying to get work done here!
rnbrady 45 minutes ago 0 replies      
This is totally awesome!
Discovering LastPass shared passwords changedmy.name
61 points by zdw  13 hours ago   33 comments top 11
UnoriginalGuy 10 hours ago 4 replies      
Cannot argue with the results. The whole concept is "impossible" to begin with: even if LastPass hid the content from the developer tools (on the LastPass UI), they would eventually still have to submit it to the third-party site, and you could use the Network tab to grab the raw HTTP request right out of the air.

So I am fine with this weakness. It doesn't impact their core product. This feature is still vaguely useful for less technically literate people, but maybe needs some kind of disclaimer.

borisjabes 36 minutes ago 0 replies      
We do this differently at Meldium (YC W13) and reliably provide a way to share passwords without end-users seeing them. The login occurs on the server side and only the session is transferred to the sharee's browser. Thus, the password is truly never shared or sent down over the wire.
STRML 7 hours ago 2 replies      
This points to what is actually quite a large problem with the LastPass vault. A lot of people I know (myself included) keep the password saved on the vault, so that it will offer to AutoFill/AutoLogin when you visit a site you have an account on. It then is set to re-prompt for the master password to actually fill or reveal the site's password.

Unfortunately, passwords are retrievable out of the LastPass vault in exactly the same way as in the article. It is trivial to simply inspect the DOM and pull them out with some basic JS. This is unacceptable IMO and must be fixed; LastPass is barely functional if you don't keep it logged in. But if you do, all it takes is a right click and a few keystrokes to reveal each password.

I feel a lot worse about this product, now.

jtheory 8 hours ago 0 replies      
This sounds like the kind of "security" feature that will show up on a criminal court case in the future -- that's why I don't like it.

Think about how easy it will be for a company to prosecute the "hacker" who was able to circumvent the security of highly-reputed LastPass to do whatever minor thing they did. LastPass uses strong cryptography and blah di blah blah, after all, so this must be a hard-core hacker who needs to be made an example of.

I understand why the feature is useful -- it's a sort-of "honesty lock" that's easy to get off, but it's obvious to the user that they're not supposed to take it off -- but LastPass should change the language around it so that non-technical users understand that regular people, non-experts, can bypass it.

troupe 4 hours ago 0 replies      
I always assumed that the option only determined whether or not they were given the ability to click and view the password in the LastPass UI. The password must necessarily exist on their computer in an unencrypted state in order to fill out a login screen. It seems really strange to me that people would assume there was a way for it to be securely hidden from the user.

Either way, sharing the password assumes that you are giving them the ability to login to your account. If the person you share with wants to give the password to someone else, it doesn't matter if they can see it or not. They can just share the password to their LastPass account. In other words the fact that they can see the password doesn't change anything from a security standpoint.

I suppose the one exception is a situation where you wanted to use the same password for your email and your bank and only wanted them to share access to your email but not let them see the password so they couldn't log in to your bank. This has a lot of security problems even if you aren't using LastPass or sharing your passwords. LastPass does warn you not to use the same password on multiple accounts unless you explicitly turn the warning off.

theboss 5 hours ago 1 reply      
Did any of you actually expect that you could share an account (a username and password) without one party knowing a piece of the information?

Whether through Burp or through DOM inspection, there is not much possibility of sharing an account without the other party being able to read the password.

The feature of sharing an account is, by definition, insecure.

The best solution, if you must share the account, is to use LastPass and change the password after they use the account and let LastPass remember the new password.

dbuxton 9 hours ago 1 reply      
This is why we have looked at services like https://www.meldium.com/ but unfortunately (because of the way it works) it doesn't work for arbitrary services, only for ones they have done an integration with.
joshuaheard 9 hours ago 1 reply      
I love LastPass and like sharing access to my accounts this way, but I don't even know why they have this option, or why it's turned on by default. This option is trivial to bypass and they should just get rid of it.
TallboyOne 3 hours ago 0 replies      
You can also just use 1Password to save the field that was filled in by LastPass, then log into 1Password and copy the saved password.
300bps 5 hours ago 1 reply      
I use KeePass with a password vault stored on SkyDrive but I would never use LastPass. The web browser opens up so many attack vectors that I don't trust anything they do.

I know one person that used to use LastPass. He was a coworker that would utilize it on a shared terminal server and he would select the option to remain logged in. I logged in as me on the terminal server, copied his Chrome cookies file to another account and was immediately able to log in as him to LastPass and access every single one of his passwords. He deleted his LastPass account that day.

There are plenty of ways to address this and other inherent security issues with it but I don't see evidence that the majority of non-technical LastPass users are utilizing any of them.

Here's a discussion that I found about the issue I discussed on LastPass' forum:


hussong 9 hours ago 1 reply      
How does this compare to the way that sharing is implemented in PassPack?
Startup School 2013 Videos Now Online thenextweb.com
151 points by talhof8  22 hours ago   32 comments top 12
balajis 21 hours ago 7 replies      
One of the speakers here. I found the online reaction to my talk pretty interesting as a case study in the internet game of telephone. Here's the talk itself:


First party viewers mostly seemed to like it:


CNET gave a second party writeup:


Then third party people started mischaracterizing it:


Finally, the Hill wrote a fourth party account, quoting these third party accounts, and that's what Washington DC saw:


Not everyone got it wrong; I think this account is closer:


But I encourage you to open up those tabs and go through them one by one to see a kind of pinball reflection of the tone of the talk. In microcosm it's an example of the emerging gap between Silicon Valley and DC, and gives a sense of how policy makers can inadvertently form their opinions from echoes of echoes. Doubly ironic and somewhat sad as we can use the internet to make direct connections between people these days. The good thing is that interested parties can see the primary source directly.

kogir 20 hours ago 1 reply      
Oops, I'm not even done submitting them all yet. Office hours are uploading now, and Chase's talk will follow right after.

Sadly, consumer internet upload speeds haven't kept up with video quality. And these are only 720p, down-sampled from the 1080p source material.

ecesena 43 minutes ago 0 replies      
Added to the list that we're curating on theneeds [1]. As always, feel free to reach me for missing material.

[1] http://www.theneeds.com/learn/top-content/startupschool

lachyg 21 hours ago 2 replies      
robbiet480 21 hours ago 1 reply      
Damn where is Chase Adam? His was the best talk!
naavinm 14 hours ago 0 replies      
I've made a YouTube playlist of all of the videos. They're in chronological order following the agenda.


hanley 21 hours ago 1 reply      
Still no video of the Office Hours
floetic 11 hours ago 0 replies      
Chase Adam at Startup School 2013: https://news.ycombinator.com/item?id=6621215
joeblau 18 hours ago 0 replies      
Thank you. I've been waiting for this since I missed a few episodes during the day.
rajacombinator 19 hours ago 0 replies      
Woohoo been looking forward to this for those of us who couldn't make it or watch the live stream. (Was busy filling out a YC app if I recall correctly ...)
hipaulshi 21 hours ago 0 replies      
zekenie 20 hours ago 0 replies      
I think the videos are all private
How to be a Programmer: A Short, Comprehensive, and Personal Summary (2002) mines.edu
76 points by sgreen  16 hours ago   40 comments top 9
Jugurtha 6 hours ago 6 replies      
> But it is really child's play compared to everything else that a good programmer must do to make a software system that succeeds for both the customer and myriad colleagues for whom she is partially responsible.

Why the use of the word "she"? I see a lot of articles written where the undefined person is a she. I speak French, and "person" is a feminine noun, so you can say "she" about a person, but why in English? Especially in male-dominated jobs like programming. A programmer is more likely to be a he than a she, so why try this hard to be politically correct?

drunkpotato 14 hours ago 5 replies      
What is wrong with us in this sick, perverted, twisted, dying and rotting industry? If you encounter these problems, start applying to a new position. Don't deal with management; don't "educate" them. Don't fix the organization you've tricked yourself into joining. Just polish your resume and leave. It seems like we're an industry that defines Stockholm Syndrome.
JackMorgan 6 hours ago 0 replies      
"Programming languages should really be called notations in that learning one is not at all as difficult as learning a natural language."

Brilliant! This whole section on choosing a language is great.

"One tends to think of a large system that has components in three or four languages as a messy hodgepodge; but I argue that such a system is in many cases stronger than a one-language system..."

This part sounds insane until you start working with eventually consistent messaging like: http://www.reactivemanifesto.org

rrjanbiah 5 hours ago 0 replies      
My favorite part is How to Deal with Organizational Chaos http://samizdat.mines.edu/howto/HowToBeAProgrammer.html#id28... especially:

  Engineers have the power to create and sustain.

ninetax 13 hours ago 0 replies      
Woah, you weren't kidding about "Comprehensive". Yet it really is short and seems to encapsulate everything it talks about pretty well.

Thanks a lot for this!

michaelchum 13 hours ago 0 replies      
Interesting educational read, wise experience from a programmer, and definitely worth reading for beginner programmers like me!
shazzdeeds 13 hours ago 1 reply      
The last section on organizational chaos and the magic power is key.
vayarajesh 14 hours ago 0 replies      
Nice read.
saddino 14 hours ago 3 replies      
Here's a shorter version: Are you so inspired by a software product or service that you are driving yourself mad thinking about how it was created? Great, do whatever you can to write your own version. Keep at it. Seriously, keep at it. Are you so obsessed with figuring this out that you are unaware that hours are passing by while you work? Awesome. You have discovered a true passion. Now you don't need to read things titled "How to Be a Programmer" because you will drive yourself to become one innately.

For everyone else, go ahead and try to read things titled "How to Be a Programmer" but don't expect it to actually help you, you know, BECOME one.

Cheaper to rent in Barcelona and commute to London bestburgerinnorthwestlondon.wordpress.com
233 points by mrud  1 day ago   235 comments top 28
cstross 1 day ago 21 replies      
The point of this piece is not that living in Barcelona and commuting to London is sensible -- it's that the North London property market has gone totally batshit insane.

No, seriously: renting a 2 bedroom flat in a not brilliant suburb of London costs around £25,000 a year, or US $40,000. Then you can add council tax (another £2,000), water, electricity, and gas bills, and travel. Upshot: the fixed costs of living in London are on the order of US $50,000 per year (two beds) or around $40,000 per year (one bed). Note that I focus on the two bed option because that's the practical minimum for a family unit, or for someone who telecommutes from home. Note also that the average gross income in London is a little under £28,000 per year (before tax).

Upshot: normal people and normal families can't afford to rent in London any more. The only thing propping up these insane prices is the scarcity induced by the current bubble in the foreign investment housing market. The crash, when it comes, is going to be epic.

outside1234 1 day ago 1 reply      
About 5 years ago I did this. I flew from the south of Spain (flying from Jerez on RyanAir) to London once a week for 2 days a week (couch surfing with a friend).

It's a hard lifestyle - by about the 10th of these flights you will be sick of the security hassles (and RyanAir) - but it was way better than living in London full time (no offense).

I did it for 18 months before finally burning out on it and moving to a full time remote position (which paid less but I decided that that was worth the upgrade in lifestyle).

meerita 1 day ago 3 replies      
Here a fellow hacker from Barcelona.

I rented a loft for €6,620/year. It's in mint condition and in the outskirts of Barcelona. I'm 12 min by subway from Plaça Catalunya, 7 min by train, or 20 min by bus. To be honest, I would never again rent in the centre of the city. It's expensive and all the buildings are antique, without proper amenities.

If you want to come to Barcelona, check the outskirts, get a scooter, or enjoy the Barcelona transportation system. It's wonderful.

I want to add some more info about living in Barcelona.

The weather is magnifique. It barely rains all year. You can go to the mountains within a 2h car ride if you want to enjoy the snow in winter.

Eating can be really cheap IF you go to the supermarket, buy all the groceries and cook yourself like I do; I saved €300/month doing this instead of eating out. If you can compile Rails, you can be a chef :). I buy staples for around €90/month (that includes the 40 litres of water I buy). Then I try to buy meat, fish or vegetables for the week, and it costs me no more than €220/month.

I pay 90 euros electricity, 30 gas and 40 water every 2 months. 60 euros for 100mbit fiber connection + phone and mobile and that's all.

keithpeter 1 day ago 1 reply      
UK renting is just idiotic at present all over, although especially bonkers in our capital city.

Remember that 1 bed flats are especially in demand at present as a result of the (in)famous bedroom tax[1]. A single person or a couple are only allowed 1 bedroom if they need to claim housing benefit (unemployed or low-wage, and remember that in London 'low waged' is a pretty high threshold, e.g. teachers, social workers, retail staff, bus drivers &c).

Bear in mind that building 1 bedroom flats has (hitherto) been regarded as a waste of money for housing associations or councils, so that really only commercial lets are available (at usually twice or three times the rent of a HA/council flat with 2 beds), so, ironically, the tax payer will be paying more to move couples out of 2 bed high rise flats in rough areas which are hard to let into expensive private let 1 bed flats. There will be no takers for the high rise flats (unsuitable for children) so they will be mothballed then expensively demolished.

Yes, bonkers, but the UK is run by the Daily Fail and other populist idiots.

[1] http://england.shelter.org.uk/get_advice/housing_benefit_and...

Edit: OK anonymous downvoter, state your reasons

CWIZO 1 day ago 6 replies      
That's a roughly 4 hour commute in each direction: 1h flight, 1h train ride from airport to liverpool street + commute to the airport and waiting. And this are, in my experience, conservative numbers.

So you spend 128 hours per month commuting, to save £387. Not what I would call a bargain.

Oh, what about double taxation? I'm pretty sure you'd be hit by that and that would most likely put you well in the negative.

lsb 1 day ago 4 replies      
This is silly. Just go southwest a few miles.

There is, for instance, a flat steps from the London Overground in South Norwood (http://www.zoopla.co.uk/to-rent/details/30891717) that is going for £400/month (€470/month).

(It's unclear why you'd want such a long commute, versus living in far-away green suburbs by the train.)

j2d3 1 day ago 1 reply      
Cheaper, but fairly ridiculous. Stay in Barcelona and telecommute.

Similarly, it would be cheaper for me to rent in Mexico City and commute to my job in Los Angeles. Yes, some global metropolises have lower rents than others.

officemonkey 1 day ago 0 replies      
So he can live in Barcelona and get home at 10 PM and leave for the airport at 5 AM.

Who cares if you live in Barcelona if you're asleep all the time?

yetanotherphd 23 hours ago 0 replies      
I'm not convinced that house prices are actually a big deal, except as an indicator that more housing should be permitted (by zoning laws).

There are two arguments that are typically given.

Firstly, you want to encourage people of different incomes to live together. I don't believe that this is a worthy goal. It's not clear that the benefit to people on low incomes outweighs the loss to their high income neighbors. And the richest 1% always find ways to isolate themselves anyway.

The second argument is that welfare should take into account the cost of living. I also believe this within reason, but the welfare system already does this in many ways. In fact, London's "one bedroom rule" is a clunky way to do precisely this: it lets people live where they like, but prevents people from purchasing an excessive "quantity" of housing.

merraksh 1 day ago 1 reply      
Or you could live in Birmingham, in a £650/month one-bedroom apartment 10 minutes away from New Street station, where you can take a 70 minute train to London Euston.

Sure, Birmingham is not Barcelona or London, but I'm not sure how you'd enjoy them while spending most of your off-work time on a Ryanair flight.

valdiorn 22 hours ago 1 reply      
I just moved to London, less than a month ago.

I decided that I was willing to pay a premium for my <25 minute commute to work (close to Tottenham Court Road). And as long as other people think the same, rent will go up. Pretty standard supply and demand. Everyone works in the center, and nobody likes to waste 2 hours of their day hopping trains and buses.

So this is what you get, take it or leave it, I guess...

tluyben2 10 hours ago 0 replies      
It's not the point of the article, but 4 hours wouldn't be that bad per se. I would hate to be in an airport every day and to fly Ryanair every day (I'd rather pay a lot more than fly Ryanair personally; I am almost 2 meters tall and quite bulky due to food and daily gym, and Ryanair is cruel punishment, no matter the cost), but when I still worked in an office in the Netherlands (granted, that's over 10 years ago), I would spend at least 2.5-3 hours in traffic jams. At least in a plane you can read a book or do some work. Sitting in a car, usually in the rain/cold, foot on the brake, ready to move yet another 30 cm, I might even consider worse than Ryanair...
yeureka 21 hours ago 0 replies      
I left Barcelona in 2007 to come to London, and although the cost of living is higher here (I live in West Hampstead), salaries and opportunities are also much better. Granted, I spend more money per month on fixed costs than my entire salary in Barcelona, but I still save more than I spend.

Also, from this article it seems that rents have actually fallen, because in 2007 I could not find that kind of accommodation for that price, and I was struggling to save any money compared to now.

I guess the housing market collapse in Spain has actually impacted the crazy Barcelona prices of mid 2007.

byoung2 1 day ago 4 replies      
I'm still convinced that we're just a few years away from a time when people don't have to commute to an office just to sit at a computer. That company in London could save some money and so could the employee in Barcelona if telecommuting were an option.
lnanek2 4 hours ago 0 replies      
Now how much does he earn per hour? How much time does he waste on a long commute each of those four days a week?
RafiqM 4 hours ago 0 replies      
Funnily enough, I came to the exact same conclusion last week while visiting London, from Dublin.

Return flights are 30, and it's faster than from Barcelona: 1hr flights (and you can show up 45 mins before the flight leaves for IE->UK).

Even using Hotel Tonight while I was in London, accommodation was 200+ on a Tuesday night.

sprizzle 16 hours ago 0 replies      
I like this article as a thought experiment, but in practice it would probably be miserable to fly to and fro 4 days a week.

The main cost that was omitted, and that would tell us whether the commute is worth it, is the opportunity cost (http://en.wikipedia.org/wiki/Opportunity_cost). While it'd be difficult to estimate how much the author's time is worth, if we assume that he/she earns an hourly wage of W, and it takes H hours to commute to and from London, then the opportunity cost would be roughly W × H. If that opportunity cost is greater than the 387 in savings, then it is not cheaper to commute from an economist's perspective.
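The break-even reasoning above can be sketched in a few lines. The weekly savings figure (387) is from the article; the wage and commute-hour numbers below are purely illustrative assumptions:

```python
def commute_pays_off(hourly_wage, commute_hours_per_week, weekly_savings=387.0):
    """Return True if the cash savings exceed the opportunity cost W * H."""
    opportunity_cost = hourly_wage * commute_hours_per_week
    return weekly_savings > opportunity_cost

# Assuming 4 hours door-to-door, 4 days a week = 16 commute hours/week.
print(commute_pays_off(20, 16))  # True: opportunity cost 320 < 387 savings
print(commute_pays_off(30, 16))  # False: opportunity cost 480 > 387 savings
```

So under these assumptions the commute only "pays" for wages below roughly 387 / 16 ≈ 24 per hour, which is exactly the economist's point being made.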

gcb0 1 day ago 1 reply      
Had a friend at UCSD who rented in Tijuana and crossed the border every day to San Diego.
zobzu 23 hours ago 0 replies      
So 1500 GBP is about 2420 USD. A 1-bedroom in the good part of SF, USA, costs about 2500/month.

Sure, there's no council tax and utilities are cheaper. Still, it's pretty close.

I bet Paris ain't so far from that either, and let's not talk about NYC.

Basically, prices in all of the large tech cities are "batshit insane".

The only hope I see, barring the 1h30/2h plane travel time the author suggests, is remote work whenever possible. You can then live 3-4h away from the big cities (so you can still get together if needed), and prices are divided by 10.

Continuous 1 day ago 1 reply      
It's a 4 hour commute!

Why would you live in Central London? You can commute for an hour into Liverpool St and get much cheaper rents.

meerita 21 hours ago 0 replies      
This post is also a great idea to start a HN Barcelona meeting :).
krmmalik 1 day ago 0 replies      
I just moved to North-West London two weeks ago and can confirm these numbers. It's an interesting write-up, and I can see that it was written as a mathematical demonstration rather than for practicality, but it does give food for thought.
daemon13 1 day ago 1 reply      
To Barcelona hackers - which districts are the best to rent a nice modern-built loft/apartment:

- from quality angle?

- from price angle?

Edit: formatting

marban 1 day ago 0 replies      
It might at least get you HON Circle membership if you took Star Alliance flights, but daily use of Ryanair can't be beneficial for your mental stability.
judk 1 day ago 2 replies      
It is well established that the prices Ryanair advertises are not the prices passengers pay.

Given the option, people would prefer to just live in London rather than deal with the hassle of the commute. Which is the whole point: real estate pricing is efficient in this case.

qwerta 22 hours ago 0 replies      
My friend works in the City. He permanently rents a hotel room nearby. He says it is actually cheaper than a normal rent.
farresito 20 hours ago 0 replies      
Just wanted to add that the apartment the article refers to is located in a really nice part of Barcelona. Definitely not the center, but relatively close to the FC Barcelona stadium and to the rich part of the city.
alexchamberlain 22 hours ago 0 replies      
East London is where it's at! It is cheaper to buy than rent though!
       cached 27 October 2013 19:02:01 GMT