hacker news with inline top comments (17 Nov 2014)
1
Resigning as a Debian systemd maintainer
52 points by martinml  1 hour ago   2 comments top 2
1
valarauca1 23 minutes ago 0 replies      
2
acd 14 minutes ago 0 replies      
Thanks for your work on Debian and systemd! Sorry to hear that you had to endure flame wars.
2
Why curl defaults to stdout
55 points by akerl_  3 hours ago   32 comments top 11
1
jrochkind1 12 minutes ago 0 replies      
I actually want printing to stdout more often than printing to a file; it's more often what I need. I guess different people have different use cases.

I will admit that rather than learn the right command to have curl print to a file -- when I _do_ want to write to a file, I use wget (and appreciate its default progress bar; there's probably some way to make curl do that too, but I've never learned it either).

When I want writing to stdout, I reach for curl, which is most of the time. (Also for pretty much any bash script use, I use curl; even if I want to write to a file in a bash script, I just use `>` or look up the curl arg).

It does seem odd that I use two different tools, with almost entirely different and incompatible option flags -- rather than just learning the flags to make curl write to a file and/or to make wget write to stdout. I can't entirely explain it, but I know I'm not alone in using both, and choosing from the toolbox based on some of their default behaviors, even though with the right args they can probably both do all the same things. Heck, in the OP the curl author says they use wget too -- now I'm curious if it's for something the author knows curl doesn't do, or just something the author knows wget will do more easily!

To me, they're like different tools focused on different use cases, and I usually have a feel for which is the right one for the job. Although it's kind of subtle, and some of my 'feel' may be just habit or superstition! But as an example, recently I needed to download a page and all its referenced assets (kind of like browsers will do with a GUI; something I've only very rarely needed to do), and I thought "I bet wget has a way to do this easily", and looked at the man page and it did, and I have no idea if curl can do that too but I reached for wget and was not disappointed.
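For the curious, the page-plus-assets download described above maps to two standard wget flags. A dry-run sketch (the URL is a placeholder, and the command is printed rather than executed, so nothing is fetched):

```shell
#!/usr/bin/env sh
# Print (rather than run) the wget invocation that mirrors a page
# together with its referenced assets:
#   -p  download page requisites (images, CSS, scripts)
#   -k  convert links so the local copy browses correctly
mirror_cmd() {
    echo "wget -p -k $1"
}

mirror_cmd "http://example.com/"
```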

2
viraptor 1 hour ago 2 replies      
I think he may be missing what people mean by "it's easier without an argument". It's not just "only one option" - what I see in reality quite often is: "curl http://...", screen is filled with garbage, ctrl-c, ctrl-c, ctrl-c, damn I'm on a remote host and ssh needs to catch up, ctrl-c, "cur...", actually terminal is broken and I'm writing garbage now, "reset", "wget http://...".

I'm not saying he should change it. But if he thinks it's about typing less... he doesn't seem to realise how his users behave.

3
NickPollard 1 hour ago 1 reply      
I think his argument is valid, and thinking about curl as an analog to cat makes a lot of sense. Pipes are a powerful feature and it's good to support them so nicely.

However, just as curl (in standard usage) is an analog to cat, I feel that wget (in standard usage) is an analog to cp, and whilst I certainly can copy files by doing 'cat a > b', semantically cp makes more sense.

Most of the time if I'm using curl or wget, I want to cp, not cat. I always get confused by curl, not being able to remember the command to just cp the file locally, so I tend to default to wget because it's easier to remember.
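The analogy can be made concrete with local files; a minimal sketch using temp files only (no network), where the same split with downloads would be `curl URL > file` versus `wget URL` or `curl -O URL`:

```shell
#!/usr/bin/env sh
# cat-style vs cp-style copying, per the analogy above.
tmp=$(mktemp -d)
echo hello > "$tmp/a"

cat "$tmp/a" > "$tmp/b"   # works, but says "stream, then redirect"
cp "$tmp/a" "$tmp/c"      # says what you mean: copy

cmp -s "$tmp/b" "$tmp/c" && echo identical
rm -r "$tmp"
```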

4
eddieroger 20 minutes ago 0 replies      
Having known both tools for a long time now, I never realized there was a rivalry between them - I just figured they're each used differently. cURL is everywhere, so it's a good default. I use it when I want to see all of the output of a request - headers, raw response, etc. It's my de facto API testing tool. And before I even read the article, I assumed the answer was "Everything is a pipe". It sucks to have to memorize the flags, but it's worthwhile when you're actually debugging the web.
5
shapeshed 1 hour ago 3 replies      
Do one thing and do it well.

IMHO cURL is the best tool for interacting with HTTP and wget is the best tool for downloading files.

6
mobiplayer 34 minutes ago 0 replies      
We all have some user-bias and in this case it is geared towards seeing Curl as some shell command to download files through HTTP/S.

Luckily, Curl is much more than that and it is a great and powerful tool for people that work with HTTP. The fact that it writes to stdout makes things easier for people like me that are no gurus :) as it just works as I would expect.

When working with customers with dozens of different sites I like to be able to run a tiny script that leverages Curl to get me the HTTP status code from all the sites quickly. If you're migrating some networking bits this is really useful for a first quick check that everything is in place after the migration.

Also, working with HEAD instead of GET (-I) makes everything cleaner for troubleshooting purposes :)

My default set of flags is -LIkv (follow redirects, headers only, accept invalid certs, verbose output). I also use -H a lot to inject headers.
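The status-code sweep described above is one curl invocation per site; a dry-run sketch that only assembles and prints the commands (the site list is invented, and the -s/-o/-w flags are standard curl):

```shell
#!/usr/bin/env sh
# Print (don't run) a curl command per site that reports only the
# HTTP status code:
#   -s silent, -o /dev/null discard the body, -w '%{http_code}' print the code
status_cmd() {
    printf "curl -s -o /dev/null -w '%%{http_code}' %s\n" "$1"
}

for site in https://site-a.example https://site-b.example; do
    status_cmd "$site"
done
```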

7
agumonkey 11 minutes ago 0 replies      
curl could parse mime type and decide where to push the stream, POC:

    #!/usr/bin/env sh

    case $(curl -sLI $1 | grep -i content-type) in
        *text*) echo "curl $1"
                ;;
        *) echo "curl $1 > $(basename $1)"
           ;;
    esac
https://gist.github.com/agumonkey/b85cef0874822c470cc6

Costs one extra round trip, though.

8
lsiebert 37 minutes ago 0 replies      
I was recently playing with libcurl (the easiest way I know to interact with a REST API in C), and libcurl's default callback for writing data does this too. It takes a file handle, and if no handle is supplied, it defaults to stdout. It's actually really nice as a default... you can use different handles for the headers vs the data, or use a different callback altogether.

I really, really like libcurl's api (or at least the easy api, I didn't play around with the heavy duty multi api for simultaneous stuff). It's very clean and simple.

9
talles 1 hour ago 1 reply      
> people who argue that wget is easier to use because you can type it with your left hand only on a qwerty keyboard

Haha, I would never have realized that.

10
0x0 25 minutes ago 1 reply      
Chrome dev tools have a super useful "Copy as cURL" right-click menu option in the network panel. Makes it very easy to debug HTTP!
11
userbinator 1 hour ago 0 replies      
I think of curl as a somewhat more intelligent version of netcat that doesn't require me to do the protocol communication manually, so outputting to stdout makes great sense.
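For contrast, this is roughly the "protocol communication" curl handles for you: the raw HTTP request you would otherwise type into netcat by hand. A sketch that only prints the request bytes (you would pipe them into something like `nc example.com 80`; the host is a placeholder and nothing is sent here):

```shell
#!/usr/bin/env sh
# The manual alternative curl spares you: a hand-written HTTP/1.0
# request, printed rather than sent over a socket.
printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n'
```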
3
Keeping Secrets
156 points by ghosh  7 hours ago   11 comments top 3
1
nly 3 hours ago 2 replies      
> According to Inman, the uptake of the research community's cryptographic ideas came at a much slower pace than he had expected. As a result, less foreign traffic ended up being encrypted than the agency had projected, and the consequences for national security were not as dramatic as he had feared. Essentially, Inman recalled, there was no demand for encryption systems outside of governments, even though many high-grade systems eventually became available.

Some things don't change. Despite the fact that the bedrock of basically all noteworthy asymmetric cryptography was laid in just a handful of years 40 years ago, and despite the fact we've had crypto protocols to solve a lot of really compelling problems for decades, the NSA, and government generally, still has little to worry about. The market has a way of selecting really lousy solutions when it comes to privacy. Consider:

- The abysmal state of implementation. Over-engineered, poorly designed, poorly implemented, and poorly deployed. Did I miss the memo for the billions of dollars of investment and meticulous engineering being poured into the cryptography Space Race? I guess a couple of OpenSSL forks is a start, right?

- Zero adoption of personal digital signatures. Zilch. Nada. You can't prove authorship of anything, and can be framed for almost anything. None of the logs being made of your activities are seen by you, let alone signed off by you as authentic.

- The complete lack of good, usable client authentication. We've known how to do secure password authentication, even in the presence of weak passwords, since before the web existed, yet we have nothing. Google Authenticator is the only meaningful contribution to authentication on the web since the 90s. (Pretty much all the third-party systems conflate the issues of identification and authorisation.)

- Complete centralisation of interpersonal messaging (Email -> Your webmail provider, SMS, Whatsapp, Facebook Messenger etc). It's all unencrypted, logged, and subverted for government or commercial interests.

- Blackboxification of consumer electronics. Yet somehow, despite the urgency of keeping DRM keys secret (essentially the same technology), we don't have usable HSMs in consumer devices like phones yet.

- Extensive surveillance of all our financial activity. Our supermarkets can track our personal shopping habits down to the items we buy week on week, and our banks know where we like to buy our Sunday lunch. We've known how to achieve cash-equivalent privacy digitally for 20 years. All we have is Bitcoin which, while heading in the right direction on trust, serves some grand libertarian ideal and accomplishes little in terms of privacy or user friendliness. Go read about DigiCash; in another life it could have shipped with Windows 95.

- The complete lack of good trust models and, more importantly, the lack of any education or inclination amongst the general public, particularly among the young and technologically comfortable, to question whether we should really be trusting website X, company Y or app Q with our personal data and habits. Social networks have changed attitudes toward sharing our personal life in one generation. My dad considers Facebook statuses bizarre. My grandma still doesn't trust plastic or direct debits, and prefers cash. We're caught in a generation gap where we have no reason to trust many entities, but have so much incentive to risk it anyway.

... clearly the demand for cryptography is still low.

2
eps 4 hours ago 2 replies      
Pardon the longer quote, but I want to comment on this -

  Rather than trying to understand both sides of the issue and make the right decision, Hellman said that in the heat of the controversy, he listened to his ego instead. It was not until Hellman watched The Day After Trinity, a documentary about the development of the atomic bomb, that he realized how dangerous his decision-making process had been. The moment in the film that troubled him most, he recalled, was when the Manhattan Project scientists tried to explain why they continued to work on the bomb after Hitler had been defeated and the threat of a German atom bomb had disappeared. The scientists had figured out what they wanted to do and had then come up with a rationalization for doing it, rather than figuring out the right thing to do and doing it whether or not it was what they wanted to do.
This attitude - screw the consequences, let's just scratch my curiosity itch - is extremely common in tech circles. Cryptonomicon did a good job presenting this issue in a highly digested form - that is, when the Avi character is setting up a data haven for all the good reasons, and the only people who show up for the (funding) presentation are criminals and rogue government agents. I was messing with anonymous private p2p systems at the time the book came out, and it was frankly a shock to read, because somehow it was an obvious angle that I had never considered at all. I was just engineering stuff because it was really interesting, but never did I consider the consequences of actual application. Realizing that there's an ethical component to every technical project was quite an eye-opener, and it has had a profound effect on how I've viewed projects ever since. Perhaps it's obvious to some or a non-issue to others, but then perhaps there are those here who can relate...

3
dsymonds 6 hours ago 0 replies      
This is a good read, though it curiously omits mentioning that the NSA's twiddling of DES's S-boxes turned out to have made DES stronger rather than weaker. That seems like a pretty important note in the story.

(http://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27...)

4
Postgres 9.4 feature highlight: Indexing JSON data with jsonb data type
141 points by ghosh  8 hours ago   26 comments top 7
1
nl 4 minutes ago 0 replies      
Does anyone know where to find the 9.4 JSON-update[1] features? I was sure they were coming, but I can't see them in the docs anywhere[2][3].

[1] http://stackoverflow.com/questions/18209625/how-do-i-modify-...

[2] http://www.postgresql.org/docs/9.4/static/datatype-json.html

[3] http://www.postgresql.org/docs/9.4/static/functions-json.htm...

2
hliyan 6 hours ago 1 reply      
This is terrific. I spent all of September looking at almost every NoSQL DB in town but nothing could handle the loads we were looking at. Then I found the Postgres JSON data types but was sad when I saw it was missing the one thing that would make such a data type worthwhile -- indexing. Now we have it!

Edit: or should I say: we will have it soon?
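For reference, the indexing being celebrated boils down to a couple of statements. A sketch with invented table/column names, printed here rather than piped to psql (run it against a 9.4+ server to try it):

```shell
#!/usr/bin/env sh
# Emit the 9.4 jsonb indexing SQL under discussion. Names are invented;
# pipe the output to psql to execute it for real.
jsonb_sketch() {
cat <<'SQL'
CREATE TABLE docs (body jsonb);
CREATE INDEX docs_body_gin ON docs USING GIN (body);
-- containment query that the GIN index can serve:
SELECT * FROM docs WHERE body @> '{"tag": "news"}';
SQL
}

jsonb_sketch
```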

3
pilif 7 hours ago 1 reply      
Now if only we'd finally get that release :-)

Over the last few years, September was always an early Christmas for me, because in September we would be getting the new release. And despite this being a database, their x.y.0 releases were always rock solid (with the exception of 9.2.0 and an issue with IN queries and indexes), so I usually upgrade very early.

This year, it looks like the release is a bit late, partly because of some on-disk storage issues for jsonb. Of course I'd like them to spend all the time they need, but I'm still very much looking forward to playing with jsonb, as it will provide a nice way of solving some issues I'm having.

I'm not using the beta releases for anything bigger than quickly seeing whether clients still mostly work because updating between beta releases is a PITA due to it requiring a full restore most of the time.

The database I'd like to use jsonb with is 600GB in size and restoring a dump takes 2-6 hours depending on the machine.

4
mau 2 hours ago 2 replies      
Many people here are comparing this feature of Postgres with MongoDB or other document-oriented NoSQL dbs. I think the comparison is just wrong and unfair.

Even if I'm amazed by the performance of Postgres for this particular task (considering also that it is a relatively new feature), I don't think performance is the reason why people are using NoSQL dbs. The problem that NoSQL dbs are helping with is scaling. I don't see this as a priority for an RDBMS such as Postgres. Take for instance MongoDB (just because it was named by many of the other comments here, but I guess the same applies to Couch or others): it's relatively simple to deploy a cluster with automatic sharding, replica, failover, etc., because these are all builtin features.

5
harel 4 hours ago 0 replies      
I'm patiently waiting for this release like I might wait for a good new film or a game. Don't think I was ever that 'excited' (if I can use that word) by a database point release.

I needed a document database/RDBMS hybrid combo for a piece of work I want to do, and was going to default to MongoDB but once I heard of 9.4 I decided to wait and see how it pans out.

6
misiti3780 7 hours ago 1 reply      
Does anyone have a link comparing postgres 9.4 jsonb and mongodb ?
7
therealunreal 5 hours ago 0 replies      
It'd be interesting to see how this compares to hstore on flat data.
5
Unraveling NSA's TURBULENCE Programs
64 points by acqq  5 hours ago   5 comments top 2
1
mike_hearn 3 hours ago 1 reply      
Would love to know more about the "Pairing and Crypt attacks" along with "Cryptovariable management". Probably the pairing here refers to the pairing between client and server rather than the cryptographic technique of using pairings... but it seems this hasn't surfaced in any of the other Snowden docs. I often wish the journalists working on that story had released more source material.
2
fit2rule 3 hours ago 1 reply      
This is so disturbing. I honestly feel that with every one of these revelations, my interest in the technology world is degraded more and more. The existence of such heinous things as the TAO, and its tendrils, brings on a serious depression. Just who do these people think they are, to defeat our lives so completely, for their own sakes? Despicable.
6
The Programmer's Price: Want to hire a coding superstar? Call the agent
148 points by eroo  8 hours ago   101 comments top 19
1
mallyvai 6 hours ago 4 replies      
I'm the lead engineer and founder of http://OfferLetter.io - We guide engineers (and designers and PMs) by helping them navigate the "last mile" - that is, the offer selection/negotiation process - in exchange for a fee from the candidates. (We got a brief shoutout in the article along with Dave and our friends at HackMatch)

I want to specifically address 1) Sam Altman and 2) Chris Fry's respective points about the problems with regards to models that align with the candidate more directly (like ours):

1) Much respect for Sam, but he's dead wrong with respect to the 'negative selection' problem - yes, good people have no problem finding work, but the key problem is that the opportunity cost remains phenomenally high for suboptimal decision making. We have actually worked with outstanding engineers who are at YC portfolio companies, simply because they wouldn't have known how to get what they're worth otherwise and push for more. And Sam is missing the point entirely with regards to worth - people in the industry are not getting paid based on their merit - not at all. The gender wage gap is perhaps the most stark example of this, but we see it, starkly, along many other demographic slices as well.

2) With respect to Chris Fry's comments - I was actually in Twitter's eng org when Chris raised the internal engineering referral bonus from 2.5k to 10k because the company wasn't getting the volume of quality people it wanted. Chris is a really great guy, but I find his point about "[at] Twitter, you get the best résumés on your desk already [via recruiting department, referrals, etc]" somewhat misleading - there's no way he would have raised the referral bonus as sharply as he did if he really felt that. In fact, there wouldn't have to be a referral bonus structure at all. Twitter is wonderful, and I loved my time there, but even there we weren't getting all the high-quality people we wanted.

[#plug: check out http://OfferLetter.io - we all deserve to get what we're worth]

2
gregjor 4 hours ago 4 replies      
10X client (freelancer) here. I'm the guy living in Thailand mentioned in the article. Let me add to a few of the comments here. I am not writing on behalf of 10X, these are my own opinions.

"I would think that people hiring would like potential hires to be unrepresented. An un-represented developer is going to be cheaper." (chrisbennet)

Price is just one factor. If I charge twice as much as someone else but I can solve the customer's problem in one-fourth the time, the customer saves money. Customers are usually focused on solving business problems within their budget and schedule, not just on hourly rates.

"It's not about the technology. It never has been. Your number one mission whenever you get a contract is to understand the business and figure out how you can get them making more money." (aantix)

Almost exactly right. It's not always about making more money, though; it may just be figuring out how to make their software do what it is supposed to do. Too many job postings and too many résumés list technologies without addressing business needs or experience. For me, 10X has been good at getting both sides to talk about and describe business requirements and at setting clear deliverables and goals.

"Boy. As someone who actually runs a contracting + project agency, that looks to be of an approximately similar size as 10x (at least before this was published), this was lifting-cars-painful to read - not just because they have PR and I don't, but because they (Solomon and Blumberg) _are the inefficiencies they are pretending to eliminate_." (scottru)

The New Yorker article was not an exhaustive description of how 10X works or who does what. Most of my interaction with 10X is with Michael Solomon, so to say he isn't adding any value is just not understanding what he does. Everyone at 10X is adding value for me, and the several 10X customers I work for or have worked for have without exception said only good things about 10X Management. In my experience most projects go wrong due to miscommunication and conflicting expectations. 10X, and specifically Michael, are good at heading those things off before they become problems, and working out solutions that are acceptable to both sides.

Yes, 10X has had some great PR. No, they aren't the only good freelancer agency or consulting firm. I've worked for quite a few placement/consulting firms and with many recruiters in my career (35+ years) and for the work I do now and the life I want to live 10X is a great fit. It may not be the right fit for every client (freelancer) or customer, and it may not be the way to go if you want to try to make millions at a startup.

A few years ago I decided to concentrate on stalled projects and broken code, the almost-working or somewhat broken stuff left behind when developers fall out with their customer and stop answering their emails. My customers are mostly smaller businesses and non-profits, without the need or resources for their own IT staff, and without the sex appeal of Facebook or Twitter. They have real business problems to solve, they can't throw everything away and start over, and they aren't qualified to recruit and hire technical staff. I found plenty of this work on my own, but when I decided to travel and freelance remotely I worried about finding customers and easing their fears about hiring someone living overseas. 10X has been a good fit for me -- they bring in plenty of customers, they have clients with every technical skill you can think of when I need help, and they are a real US-based company that can assure customers I will deliver no matter where I happen to be. They have also negotiated better rates and more useful contracts than I was getting on my own.

3
latch 5 hours ago 1 reply      
Many companies claim that hiring is a top priority, but do little more than put up a JD in a couple places. They then complain about shortages.

You need to actively recruit talent, which means understanding what you're looking for. Building nginx modules? Go through github and see who's built an nginx module before. Do a google search for "nginx module development" and see where that leads. Then send out brief but targeted emails to people you sincerely want to join your team.

There's a ton of developers who aren't actively looking, but presented with the opportunity, they'll jump. I don't know a single developer who isn't flattered and intrigued by a sincere cold call for employment from an actual company (they're pretty easy to filter from agencies).

Second, money and location are a big deal. "We can't find developers" really means "We can't find developers who want to work for this salary and/or at this location." I remember not too long ago, our startup had a budget for 10 new developers, which we just couldn't fill. Our CTO and CEO 100% refused to get 5 developers and pay them 2x. So instead we stayed 10 people short, for over a year, while "hiring remains a top priority."

TL;DR - If hiring is important, spend the necessary time on it, don't just pay it lip service.

4
scottru 6 hours ago 1 reply      
Boy. As someone who actually runs a contracting + project agency, that looks to be of an approximately similar size as 10x (at least before this was published), this was lifting-cars-painful to read - not just because they have PR and I don't, but because they (Solomon and Blumberg) _are the inefficiencies they are pretending to eliminate_.

Let's take a few parts of the article:

>>"The three partners have separate roles. Blumberg handles his and Solomon's eleven remaining music and entertainment clients, and takes care of back-office matters: Accounting, invoicing, collection, payouts. Everything that's the bane of most people's existence. Guvench vets new talent. Potential clients have to fill out a questionnaire that one programmer compared to the most complicated dating Web site ever. Then Guvench and Solomon conduct interviews, to screen for communication skills. (I heard one potential client say, during a meeting in Solomon's office, We don't want people who just write code and drool.) Guvench also does code reviews: testing Web sites that aspiring clients have built, and reviewing the programs they've written."

So:
--Blumberg isn't working on the business at all;
--Solomon's work isn't even described (except "conducting interviews for communication skills").

So there's one person, Guvench, an ex-engineer, who's actually doing the technical vetting - i.e. 100% of the value so far is coming from one guy.

OK, then maybe the others are selling? Nope.

>>"10x technologists are working with a variety of customers: Live Nation, a virtual-reality startup, and an N.B.A. player who has an idea for a social-messaging app. Solomon admitted, however, that this list is somewhat random: it consists mostly of people who found 10x through Google, or whom he or his clients know personally. He has hired a salesman, to pitch 10x to companies."

OK, so you're closing PR-driven leads and your friends in the entertainment business? That's your sales pipeline?

I know a number of agencies with two or three partners running the organization. I don't know a single one of those where there isn't somebody pounding the pavement, hustling, finding clients - somebody who knows the difference between a long-term partner and a sports star with an "idea for a social-messaging app." (We _all_ hear about those.)

The other value they're talking about is in the negotiation process. Hey, I'm totally willing to believe that a many-year entertainment agent is a better negotiator than I am, at least in the first-principles department. But this is not some magic skill in what is generally a well-defined and competitive market, and of course you're better at it when you deeply understand the technology and market, the BATNA for the client, etc. Those of us who actually understand the very small markets that one job description might meet are, in fact, pretty darned good at it too. For that matter, I've never told an engineer that you should work with us because we can get you a better deal than you can get for yourself, and if you're dealing with a client who understands the market (which said NBA player may not), that's pure hokum. (P.S. plenty of people on HN provide that coaching for free all day long.)

It's ok that the author doesn't really understand this market, and so the competitors she mentions aren't really competitors at all - they're all focused on full-time hiring. I guess it's also OK that the New Yorker's fact-checking department didn't discover that there's no such programming language as "THP" - that should be "PHP." (Maybe it's just a typo.)

But to let the reader believe that this approach represents the best this market has to offer - well, I guess that's just really, really great PR. Back to work.

(Added later: I realized I commented on these folks 1.5 years ago at https://news.ycombinator.com/item?id=5527610. I was feeling nicer then? Maybe? It looks like the participant in that HN thread was the partner who's clearly adding value.)

5
ap22213 19 minutes ago 0 replies      
It's about time that high-end coders are getting the prices that they deserve. Many skilled coders aren't skilled in the art of negotiation, and that ends up bringing down the market price for everyone. I've seen far too many highly-talented people getting roped into five-figure salaries, when they'd be more appropriately priced in the seven figures.
6
danso 8 hours ago 11 replies      
I've never been a freelance developer so I don't know what my rate would be...but I would really like to get some insight on what $200+/hr web-development is like. I mean, what are the expectations versus a $50-developer?

For example, I could probably build in 10 hours a customized, nice-looking Twitter Bootstrap Rails site with Stripe integration, deploy it onto EC2, and set up Capistrano to integrate with whatever existing Github flow they have...but then when it comes to building the admin...um...developing the admin from a technical standpoint is non-trivial, but developing it in such a way that it is hassle-free for the client...How exactly does the developer do that, without extensive consultation time with the client? And what if their in-house developer (let's pretend they have one who is competent) doesn't have a workflow like I imagine a good workflow should be?

In other words, I'm having a hard time imagining what a rockstar developer could singlehandedly create that would be spectacular and would be something that that mortals can use and maintain on their own...but obviously that's why I'm not a $200+/hr freelance developer.

7
forrestthewoods 6 hours ago 4 replies      
Those rates really don't seem all that high. I mean $250/hour * 40 hours/week * 50 weeks/year = $500,000. That's kind of a lot. Except then the article itself mentions employees getting a few million a year in stock from Facebook or Google.

I feel like there exists a market for getting start-ups launched on the right foot. Twelve weeks (480 hours) for $1,000,000 with a bonafide rockstar to get your concept not only up and running but well designed to be carried forward. Do you guys think people would pay that? If no, how much do you think people would spend?

If your idea is good and the work is good it would easily be worth it. Of course proving that you can deliver ahead of time is more than a little difficult.

8
r0h1n 7 hours ago 2 replies      
Playing the devil's advocate here, but why is the concept of free-agent developers who (a) command huge premiums, and (b) prefer working on short, intense sprints instead of with one employer, still not here yet?

I'm not saying 10X Management is an example, but why can't individual programming be valued like, say, acting/singing etc.?

9
sort3d 20 minutes ago 0 replies      
Who would hire a firm called 10x (two timers?)
10
skrebbel 7 hours ago 1 reply      
Sounds like 10x Management hired a PR firm.
11
eroo 8 hours ago 1 reply      
>>"Enter the agents. Solomon describes himself as an equalizer. In creative industries, he told me, 'there's always this pattern that the creatives start out at the bottom of the food chain and are exploited.'"

Even recognizing that the current hiring model has major inefficiencies, it's hard to not see this as awfully ironic.

>>"part of our goal is to de-risk freelancing and make it more viable. [...] She also appreciated that they had been vetted for interpersonal skills. At one point, they had to speak directly with the health-care company's New York offices. 'They were good,' she said. 'And it wasn't embarrassing to let them out of their cave.'"

The value proposition of the agent, pushing both technical and personal professionalism of candidates, should be addressable through a reputational system that doesn't take 15% and require ad hoc negotiations. It would, however, have to be complex enough to address how well certain talent is at addressing specific projects. How much of that is a lack of proper metrics and how much is the hiring party's inability to frame their needs?

12
mgkimsal 1 hour ago 0 replies      
I've been predicting this sort of approach would happen for a while now, and glad to see it taking shape. I would like to see it more prevalent, but living in a somewhat less business-focused area, this approach may not trickle down here for several years.
13
mathattack 8 hours ago 2 replies      
Can you really get rockstars like that for $150-$250/hour including the agent's fee?

How are these agents any different than other contracting firms, other than their supposed access to the best?

14
hackdays 5 hours ago 0 replies      
We are embarking on a reverse negotiation model for startups as well. Feel free to signup http://250ksalary.com

There are lots of areas in life where the cost of acquiring a quality product/service is clearly communicated. Salaries haven't been one of them, which might change.

Don't confuse upfront salary negotiations with a lack of motivation etc. This just brings more quality candidates to the job market, saves everyone a lot of time and lets you focus on other important parts of the hiring process.

15
einrealist 5 hours ago 1 reply      
How is the rockstar being evaluated? I mean, if the rockstar sucked at two clients before, is it still 250/hr for the next client, because he co-authored X? Or is the 100% success rate guaranteed in the contract? It is like reading about a homeopathy product.
16
rajacombinator 1 hour ago 1 reply      
That's some lolworthy PR right there. The "UX designer for Apple's iCloud"??? These headhunters must have no idea how stupid that sounds to anyone who is actually in our industry.
17
_pmf_ 4 hours ago 0 replies      
Introducing a clueless additional middleman rarely solves a problem.
18
ExpiredLink 4 hours ago 3 replies      
The '10x' super star and other myths debunked:

https://leanpub.com/leprechauns/read

19
0800899g 3 hours ago 0 replies      
great comment section i must say
7
6 Dockerfile Tips from the Official Images
21 points by melbo  2 hours ago   discuss
8
Young Brits in Silicon Valley
61 points by kul  6 hours ago   22 comments top 3
1
GolfyMcG 0 minutes ago 0 replies      
> from nobody to millionaire in weeks

This is what's wrong with media today...

2
irremediable 1 hour ago 1 reply      
The London startup scene is interesting. I think it's fair to say startups here are usually less ambitious than in SF or the US in general. But they also tend to be more practical -- there's less pie-in-the-sky thinking.

I've seen firsthand a surprisingly large number of startups that make a profit from day one. And a lot of startups that were funded from nonwealthy founders' savings. My impression is that we do stuff on a smaller scale, but it tends to be more reliable.

3
georgespencer 5 hours ago 2 replies      
Support network in London is great, because people want to see a huge success story. But actually that's not what people need most (IMO). They need mentorship and support from people who have done it before.

We were lucky to share our office with a YC company when we raised our Series A. I must have bugged one of the founders every day for advice on terms and which partners were cool. But now we're all just scaling like crazy and there isn't that shift up the chain towards new mentors who have gone from e.g. 10m p.a. to 100m p.a. revenue. Those people are rare overall, but concentrated in the Valley.

9
Good Game: The Rise of the Professional Cyber Athlete
58 points by austinz  7 hours ago   6 comments top 4
1
keerthiko 2 hours ago 1 reply      
Ugh. The article made the mistake of making it sound like the casters were amazed by the fact that Scarlett is female. This is the sign of someone who's clearly an onlooker and doesn't pay close attention to the eSports scene. While there are definitely things like GamerGate making the gaming scene look terribly unsavory for female participants, in truth the eSports community has long since fully welcomed Scarlett as a true gamer, girl or not. I dislike articles forcing this focus on her being a girl and somehow sticking out for it, much like women in tech hate being glorified just for being "a woman in tech." They have real skill and contribution outside of being an outlier for their gender.

If anyone who knows what's going on is ever amazed she won, it's because she was actually an underdog in the specific matchup, not because she's a female, let alone for being a "foreigner" (non-Korean). In fact, the amount of amazement at a foreigner winning was pretty low by the time of the match the article writes about. I remember watching the game mentioned here last Thanksgiving, and it coming down to the wire, and the audience didn't care that Scarlett was "a token female" or anything; we were just excited about the awesome match and that Scarlett pulled a miracle win. The casters were no different: all the eye-popping was purely about the awesome decision making and creativity leading to an excellent game.

Don't make the gaming community sound more sexist than it already does on its own. Scarlett may have faced difficulties or felt singled out in the past for it, but it certainly wasn't true in the fashion depicted in this article.

2
ahstilde 2 hours ago 1 reply      
McGrath provides a fascinating view into a unique individual. That is, he is using Scarlett to talk about e-sports, not the other way around. And I think this causes him to miss the whole story. Scarlett does not represent e-sports, unfortunately. The fact that she was able to climb onto a global stage, starting as a nobody, is undeniably incredible. However, with "The Rise of the Professional Cyber Athlete" as its title, I expected the article to be about the rise of e-sports, not of a singular e-sports athlete.

E-sports is on the cusp of exploding, and video-game live-streaming service Twitch.tv plus the increasing availability of the internet at all times (smartphones help) are a large part of the reason why. Dota2 and League of Legends (both MOBAs) lead the forefront when it comes to players and money, but Starcraft II (real-time strategy a la Age of Empires), while declining, is not going anywhere. Additionally, Hearthstone (Blizzard's online card game) has exploded onto the scene in the past year, proving to appeal to casual and competitive gamers alike with its free-to-play model and low learning curve. Rounding out the pack are the fighting game communities (Super Smash Bros, Ultra Street Fighter IV) and first-person shooters (Halo, Call of Duty, Counter-Strike: Global Offensive). Yes, console-based games have a harder time creating a high-level competitive scene, but it isn't impossible. Starcraft 1, Counter-Strike: Source, and Super Smash Bros Melee have all been played competitively since almost the turn of the century.

I'm rambling at this point...

If anyone has any questions regarding the e-sports scene, from local grassroots tournament organization to being a high-level competitive player, to other Scarlett-esque people, please ask. I'm most familiar with Hearthstone and Super Smash Bros.

3
gabemart 1 hour ago 0 replies      
The Scarlett vs Bomber game is on youtube:

https://www.youtube.com/watch?v=jIygo3bIVmo

"Oh my god, Scarlett is going gas" at 1m30s

4
ozh 58 minutes ago 0 replies      
This article is 10 years late, honestly.
10
Show HN: A Python Spider System with Web UI
117 points by binux  10 hours ago   27 comments top 11
1
adam-_- 2 hours ago 0 replies      
How does this compare to scrapy? Why would I use one over the other, or is either a fine choice?
2
_bitliner 48 minutes ago 2 replies      
I really like the flow/UX. Congratulations! Nice job!

What is the roadmap?

I'm really into scraping; it's part of my daily job. I could consider integrating it into one of my architectures.

3
kidsil 1 hour ago 0 replies      
Thanks for making me feel bad about my python-based aggregation solution :)

https://github.com/AZdv/agricatch

4
meowface 9 hours ago 0 replies      
This looks really nice. The API seems more user-friendly than scrapy's.
5
erikb 1 hour ago 1 reply      
What is a "spider system"? Never heard that term before.
6
OedipusRex 7 hours ago 0 replies      
Can someone explain what this is?
7
mrmondo 8 hours ago 0 replies      
Nice project! I do wish it supported a PostgreSQL backend rather than (or as well as I guess) MySQL.
8
bowlofstew 10 hours ago 0 replies      
That is a nice tool....nice work!
9
Immortalin 9 hours ago 1 reply      
Any plans for a GUI-based web scraper interface similar to Portia?
10
bjblazkowicz 7 hours ago 0 replies      
How's the performance compared to scrapy?
11
zbb 9 hours ago 3 replies      
Take a look at the source code. The package hierarchy is not Pythonic (using "libs" as the top package is not a good idea).
11
The Dutch Village Where Everyone Has Dementia
61 points by prawn  12 hours ago   23 comments top 6
1
mcv 2 hours ago 0 replies      
Inconsistent spelling of names in the article: is it Hogeway or Hogewey? (It's Hogewey. Or De Hogeweyk, which is the name of the fake neighbourhood, whereas Hogewey is the name of the nursing home operating in it.) And the town it's in is called Weesp rather than Wheesp.

Other than that, cool article. It's not far from where I live, but I'd never heard of it. Hard to get in, apparently. I hope they open more places like this before my parents get to the point where they might need it.

2
MrJagil 2 hours ago 6 replies      
I have never delved into the issue, but the model that Japan and other countries have adopted where the young take care of the old, has always seemed to me the most humane and stimulating solution for the elderly. I don't feel it corresponds well with how modern society works, and my impression is that that model is fading in popularity in Japan as well, but I certainly believe there are innumerable positives to extract from that kind of care-taking.

Along with other speculation such as a seafood-heavy diet, it seems to me to be the reason that Japanese people live the longest.

(Nope, not a single shred of evidence in my post, sorry. A quick google search somewhat agrees though)

3
danmaz74 4 hours ago 2 replies      
Very interesting article.

Nitpicking: Why do they give the cost for this solution "per month", and then for comparison the costs in the US "per day" and "per year"? Did they choose to make it harder to make a comparison??

4
edwinjm 2 hours ago 0 replies      
CNN World published their item on YouTube:

Untold Stories: Dementia Village

https://www.youtube.com/watch?v=LwiOBlyWpko

5
aapje 4 hours ago 1 reply      
If you are interested in more images or detail, the village is called 'Hogeweyk' (meaning: higher area/borough), not Hogewey or Hogeway as the article states.
6
NKCSS 4 hours ago 2 replies      
Wow, this was a great read. I live in The Netherlands and was unaware of this project. Too bad it's one of a kind and only accommodates so few people; I'd love to be cared for in this way should I ever need it.
12
What does the NSA think of academic cryptographers?
200 points by robinhouston  20 hours ago   29 comments top 9
1
voltagex_ 10 hours ago 1 reply      
From the "How to submit an article" section:

N.B. If the following instructions are a mystery to you and your local ADP support is no help, please feel free to call the CRYPTOLOG editor on 963-3123s.

Send a hard copy accompanied by a diskette (either 3.5" or 5.25") to the editor at P0541 in 2E062, Ops. 1, or send via e-mail to mebutle@p.nsa.

For maximum efficiency (as far as possible within the limits of your word processor):

do not type your article in capital letters

do not double-space between lines

but do double-space between paragraphs

do not indent for a new paragraph

classify all paragraphs

do not format an HD diskette as DD or vice-versa

label your diskette: identify hardware (operating system: DOS or UNIX), density of medium, and word processor

put your name, organization, building and phone number on the diskette

CRYPTOLOG is published in FrameMaker on a Sun HPW.

If you do not have access to FrameMaker, ASCII format is preferred

2
tptacek 13 hours ago 1 reply      
Interesting to do a where-are-they-now with the names here. Don Beaver, for instance --- the "charismatic preacher" --- is now a Sr. Software Engineer at Apple, after a 4-year stint at Google doing stuff like security for GFS.
3
D_Alex 12 hours ago 1 reply      
In dismissing the "philosophical" research into cryptography, the NSA writer makes the same error of judgement as the business leaders and politicians make in dismissing research into the fundamentals of, say, physics.

The most significant discoveries either come directly from or are built upon foundations of such research.

4
mturmon 14 hours ago 3 replies      
So funny to read, and with smart commentary by Aaronson about the divergence of interests between the NSA and the university crypto community.

I have pitched tech to the NSA before, and it seemed like they were more interested in benchmarking the capabilities of the outside world than in actually adopting the technology we were pitching.

5
IvyMike 12 hours ago 1 reply      
Note that the conference happened in 1992.

Non-government cryptography has come a long way and become a lot more practical in the subsequent 22 years.

6
danieltillett 14 hours ago 2 replies      
The last two comments are interesting. I am sure someone here knows who wrote the NSA report from the information not redacted.
7
barrkel 13 hours ago 2 replies      
FWIW, this is much more readable if the CSS justify rule is disabled.
8
voltagex_ 10 hours ago 0 replies      
Surprisingly, there hasn't been much discussion of Cryptolog previously.

https://news.ycombinator.com/item?id=5407036

9
frozenport 9 hours ago 0 replies      
Do you believe the culture has changed in the last 20 years?
13
Reproducible Development Environments with GNU Guix
63 points by rev  8 hours ago   13 comments top 5
1
arianvanp 5 hours ago 1 reply      
Been using nix for a while to manage my development environments and it's been nice. The config syntax is just a bit unfamiliar, so I'd be happy to check out Guix. The problem is that I also use non-free software and I haven't yet figured out how to set up my own Guix repo for that. Anyone had any luck with that?
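
For anyone coming from Nix and curious what the Guix equivalent of a per-project environment looks like: it is typically driven by a manifest file. A minimal sketch (the package names here are illustrative; `guix shell` is the spelling in current Guix, while older releases used `guix environment`):

```scheme
;; manifest.scm -- declares a reproducible development environment.
;; Enter it with:  guix shell -m manifest.scm
(specifications->manifest
 '("gcc-toolchain"   ; compiler, linker, libc headers
   "make"            ; build driver
   "python"))        ; scripting
```

Because the manifest is just a file in the repository, every contributor can spawn the same environment from it.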
2
amelius 3 hours ago 1 reply      
That's nice, but why stop at being a package manager? Why not also replace the build tool (make et al.)?

If all the functional machinery is in place, this would be perfect for a build tool.

3
mat_jack1 2 hours ago 2 replies      
What's the benefit of Guix compared to tools like Chef (https://www.getchef.com/) or Puppet (https://puppetlabs.com/)?
4
sly010 1 hour ago 0 replies      
Introducing: "Meta" a package manager for package managers.

To install, just type:

meta install meta

5
zwischenzug 5 hours ago 0 replies      
A similar project, using docker and pexpect for dynamic and programmable - but auditable and deterministically built - images:

http://ianmiell.github.io/shutit/

14
Scrap your MapReduce Introduction to Apache Spark
44 points by Garbage  9 hours ago   1 comment top
1
jnaour 27 minutes ago 0 replies      
Good introduction. Spark is really a project to watch in the data analysis field on distributed architectures. We performed several benchmarks and Spark keeps its promises: 2.5x faster compared to Pig for the same algorithm on the same cluster.

For iterative algorithms, with the in-memory capabilities, performance is really good compared to Hadoop.

The project is still young with several bugs but the documentation is really good and the code is well commented and robust.
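
The programming-model gap between raw MapReduce and Spark that the article (and this comment) allude to shows up even in word count: Spark replaces hand-written map/reduce plumbing with a flatMap/reduceByKey pipeline. A framework-free sketch of that shape in plain Python (no Spark required; `lines` is inline sample data):

```python
from collections import Counter
from itertools import chain

lines = ["to be or not to be", "to think is to be"]

# Spark-style pipeline, emulated with plain iterables:
#   lines.flatMap(split).map(lambda w: (w, 1)).reduceByKey(add)
words = chain.from_iterable(line.split() for line in lines)  # flatMap
counts = Counter(words)                                      # map + reduceByKey

print(counts.most_common(2))  # -> [('to', 4), ('be', 3)]
```

In real Spark the same pipeline runs distributed, and caching the intermediate RDD in memory is what makes the iterative algorithms mentioned above fast.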

15
VP8 and H.264 to both become mandatory for WebRTC
123 points by kinetik  15 hours ago   45 comments top 12
1
shmerl 8 hours ago 4 replies      
I hope Daala will put an end to this mess. But even though Opus is mandatory now, it hasn't yet translated into support from Apple and MS, for instance, for regular music and Web audio. Their historic sickening opposition to open codecs is not easy to dismantle. Apple still doesn't even support FLAC, just because they like to make things messy for everyone.

By the way, what happened with Nokia's attacks on VP8? Were they refuted by Google or they were validated by some courts?

2
rurounijones 11 hours ago 5 replies      
> WebRTC-compatible endpoints will be allowed to do either codec, both, or neither.

Neither?! What? How will that work then?

3
yuhong 12 hours ago 1 reply      
Hopefully this means MS is finally willing to support VP8 and hopefully WebM too.
4
asmicom 9 hours ago 0 replies      
I saw it coming from a long way off. We were butting heads at the IETF 88 in Vancouver last year, and following the various correspondence on the mailing list, I knew we needed to make a compromise.

Good job!

5
billconan 9 hours ago 1 reply      
I thought this was the VP9/H.265 era already?
6
brunorsini 9 hours ago 1 reply      
I don't really agree with the author's comment that this is "an unmitigated win for users": if nothing else, hardware products might become more expensive because they will need native encoding/decoding capability for each codec.
7
MichaelGG 11 hours ago 2 replies      
Does this matter? Implementors can and will just do whatever they want for really critical things like this.
8
markjonson 1 hour ago 0 replies      
Seems like a marketing recipe for disaster. http://www.easycabs.com
9
fithisux 2 hours ago 1 reply      
Why not VP9?
10
l33tbro 6 hours ago 0 replies      
I thought the licensing thing is not an issue if you switch to x264 (open source)? Better video also: smaller file size, fewer artifacts, doesn't desaturate the image.
11
higherpurpose 9 hours ago 1 reply      
Only these two, or can others be used as well - such as Daala?
12
ck2 5 hours ago 1 reply      
The problem is the minimum level for WebRTC is so low, it makes H.264 useless for regular video decoding (like what you'd find on youtube).

So hopefully browsers implement more than the minimum.

16
Mosaics Revealed at Ancient Greek City of Zeugma in Turkey
33 points by diodorus  9 hours ago   8 comments top 3
1
stinos 3 hours ago 3 replies      
Marvellous. This excerpt from the article really says what I initially thought when seeing the pictures: "What is really striking about this mosaic is the wonderful and vivid colors used as well as the beauty of the heroes' faces"
2
Luc 1 hour ago 1 reply      
The (blogspam?) article slyly omits mentioning a date, but I think these are mosaics recovered more than 10 years ago, and currently on display at the Zeugma Mosaic Museum.
3
return0 3 hours ago 0 replies      
This city is apparently now underwater, but is full of wonderful mosaics. This is an interesting documentary about the restoration of mosaics during 2000-2004:

https://www.youtube.com/watch?v=QUJ7PHCNOVs

17
A Boy and His Atom: The World's Smallest Movie
41 points by WestCoastJustin  12 hours ago   4 comments top 4
1
ColinWright 41 minutes ago 0 replies      
Submitted many times, and yet it never gets any discussion. The only comment on all the previous submissions (from a quick, desultory search) is this:

========

How it was made : https://www.youtube.com/watch?v=xA4QWwaweWA

What are the ripples? : https://www.youtube.com/watch?v=bZ6Hv_du2Zo

========

Similarly, it never seemed to get much attention, judging by the severe lack of up-votes. Here are the previous submissions I found (some deleted):

https://news.ycombinator.com/item?id=8258631

https://news.ycombinator.com/item?id=7242743

https://news.ycombinator.com/item?id=5648856

https://news.ycombinator.com/item?id=5645372

https://news.ycombinator.com/item?id=5643518

https://news.ycombinator.com/item?id=5642074

https://news.ycombinator.com/item?id=5637191

2
rasengan 1 hour ago 0 replies      
This is a very interesting movie. The 'making of the movie' was interesting too. I liked the keyboards that they were using in the office.
3
4
kitd 4 hours ago 0 replies      
Cool ... though the physics was disappointingly "classical" ;)
18
Emacs Rocks
328 points by pmoriarty  21 hours ago   102 comments top 18
1
entreprenewb 18 hours ago 2 replies      
I ran across this site the other day when I was looking at overhauling my .emacs file since I'm only an occasional emacs user these days. This guy has a great starter repo of his customized .emacs.d directory that takes care of a lot of the setup work to use all the customizations and modes he's using: https://github.com/magnars/.emacs.d

One note, if you do clone the repo, make sure to use the --recursive option (as the readme instructs) since there are a bunch of other git repositories imported in the repo.

2
aarohmankad 19 hours ago 4 replies      
There's also http://emacs.sexy/
3
thomasfl 2 hours ago 0 replies      
I've been using Magnar's and Christian's excellent collection of emacs settings and packages for some years now, and haven't bothered to adjust the settings in emacs myself ever since.

It's like turning your trusty old emacs into a monster hacker tool. Watching the screencasts is much more entertaining than reading boring documentation.

4
noobermin 19 hours ago 0 replies      
I've been using emacs for years and I didn't know a few of these tricks!

I've been meaning to dig deeper, and things like this site are a great way to whet my appetite.

5
martin1975 17 hours ago 7 replies      
I've never used emacs; I'm currently using Xcode to do C++ development. While I find the demonstrations interesting, what would emacs buy me as a C++ developer?
6
weavie 15 hours ago 0 replies      
The annoying thing about customising Emacs is that when following along with tutorials, I keep tripping up against shortcuts that I have reconfigured to work in a different way. Luckily emacs lets you type out the command in full to bypass the short cut key. I can then determine if it is useful enough to remap the command to a different key.
7
rayalez 17 hours ago 0 replies      
Thank you very much for a great link! I am using emacs for almost everything I do, and I've naturally learned a lot about it, and I was planning for a long time to start seriously learning some in-depth things.

I've just recently learned the basics of lisp, this will be the perfect time to find out more about emacs. It is fantastic, and every time I discover some new feature it blows my mind how great and useful it is.

8
craigching 16 hours ago 0 replies      
Another good emacs site is http://www.masteringemacs.org
9
rayshan 18 hours ago 2 replies      
Nice! Is there something similar for vim? For newbies looking to get into one, it'll be nice to see a comparison.
10
agumonkey 11 hours ago 0 replies      
11
cosarara97 19 hours ago 6 replies      
Maybe if there was a way to use it without modifier keys. Even using shift for caps is painful for me.
13
azinman2 10 hours ago 1 reply      
Nice concept. I'd like to see this for vim, the one "true" text editor.
14
spiffy-spaceman 17 hours ago 0 replies      
I just found my new emacs logo ;-)
15
GoofballJones 19 hours ago 2 replies      
It's a great OS.

I mean, Emacs is an OS, right? They've certainly crammed enough things into it to make it an OS.

16
covi 20 hours ago 2 replies      
"Mind Exploded" basically does what vi/vim natively does in an inferior way... (2x key strokes!)
17
alttab 19 hours ago 0 replies      
I'll leave this comment here for those of us who still prefer vi. :)
18
na85 20 hours ago 1 reply      
Emacs is pretty decent, yep. It remains my editor of choice. From what I've read over the years, however, it's a paradigm of poor design under the hood.
19
Japan Falls into Recession
163 points by ssclafani  13 hours ago   173 comments top 14
1
dimitar 11 hours ago 4 replies      
Here is the common ground and controversy between most economists on the situation, I think it is interesting how much common ground there is:

The common ground:

* The government of Japan faces budget constraints; it cannot tax more than a certain amount and that includes seigniorage (taxing using inflation).

* Right now Japan doesn't seem to be immediately close to those constraints since interest rates and inflation are low.

* Lowering taxes, spending more and depreciating the currency will expand the economy, but rates will increase and so will inflation (along with wages).

* Inflation expectations can create actual inflation. It can be generalized that different people will demand higher prices in advance if they can, since they know their costs will rise. The same applies to interest rates, and there is a link between them (investors demand higher yields if inflation is expected).

* Default and excessive inflation can be a result of too much expansionary policy (eventually, what is too much is up for debate), but they can destroy the gains and make the economy worse off.

The disagreement (you can see that it's actually a spectrum of opinion and there are differences between the details of the policies, but for clarity I've divided them neatly into two camps):

* School A believes expansionary policy will make Japan default because the government will have lost control, since expectations can make interest rates and inflation jump rapidly. They cite as evidence that the level of debt to GDP is over 200%. They say the government should not lose credibility, or else.

* School B believes that the expansionary policy is so hard to actually pull off that some expectations of inflation and higher rates are desirable. Since rates stay low and deflation is always around the corner, it seems that the government can easily reverse too much expansionary policy, far before a default appears to be likely. Additionally, increased GDP will bring more revenue, decreasing the need to rely on inflation after a certain point. They joke that the "government should credibly promise to be irresponsible" to get out of the bad equilibrium that is the lost decades.

-----

A political compromise appears to have been made by mixing expansionary policy with the decision to increase the sales tax. Since this caused a recession, school B feels vindicated: getting onto a default-and-inflation path is really hard. Interest rates and inflation refuse to budge.

However, the lack of progress will add even more to the debt to GDP, perversely aiding school A (even though some of them might agree that B was right in the previous period). So the end result has been 20 years of the government oscillating between those two positions, without reaching a point where either side can claim victory (default or significant GDP growth).

2
akg_67 9 hours ago 1 reply      
We have been traveling in Japan for the past 4 weeks (another 2 weeks to go). Talking to regular folks, there seems to be a divergence between what regular folks think needs to be done and what the government is doing.

The regular folks want to see Dollar-Yen parity. As the Yen drops, things become more expensive and regular folks cut back more on spending. I am not sure why the increasing cost of imports doesn't show up as inflation. Things would have been worse if not for the drop in oil prices (a major import for Japan). These folks don't believe a dropping Yen is the solution, as the majority of investment by Japanese companies is outside Japan and a weaker Yen just reduces that investment, in turn letting less profit flow back into the country.

The expectation of deflation is deeply rooted in regular folks. Everyone is waiting to spend on large items, just not right now. Land prices are falling, so no one wants to buy a house now. We were interested in buying a place in Sapporo but everyone told us to wait until we really need to buy. An apartment costing $200-300K rents for $500-700/mo in Sapporo compared to a similar place in Seattle renting for $1,000+/mo. One of our friends mentioned that they had just sold their house in Sapporo and had to offer a healthy discount to the buyer.

I don't have a solution to Japan's problems. But QE, weaker Yen, and raising taxes doesn't appear to be the solution. Rolling back tax increases may be a start in the right direction.

Regular folks seems to have better pulse on the problem than establishments.

3
skwirl 13 hours ago 0 replies      
Here's an article that's not behind a paywall: http://www.bbc.com/news/business-30077122
4
mc32 12 hours ago 13 replies      
Quantitative easing, Yen depreciation...

What they need to do is figure out a way to grow the population locally or through large-scale immigration. I'm afraid this is the fate that awaits the developed world (or countries with low population growth).

Japan is like a canary in the mine of post-industrialism. It'll be interesting to see what they figure out for their society and the lessons they might have for us.

5
shard 12 hours ago 1 reply      
6
efjhewjfnkew 12 hours ago 2 replies      
Here's the thing: you actually need a strong middle class to buy your products. No amount of artificially inflating the value of the stock market through quantitative easing will give you a healthy economy. QE is like putting a fresh coat of paint on a house with a rotten interior: it makes things look pretty, but the structural integrity of the building is almost non-existent.

Employees working themselves to death for little pay is only aggravating the problem, rather than solving it.

7
nichtich 8 hours ago 0 replies      
Before Abe, Japan was in a trap:

1. It has a 200% debt-to-GDP ratio

2. It has near zero interest rate and negative to zero inflation

3. It has near zero growth rate.

It's important to understand how this trap works: Japan simply can't have meaningful growth. If there's real growth, that will force interest rates up; otherwise there will be mass misallocation and high inflation. But given 200% debt to GDP, the government just can't afford a higher interest rate, as the debt-servicing cost will eat up most of the budget. So, assuming the growth rate and interest rate are about the same (big if, I know), just to maintain the status quo (regarding the debt burden), for every x% the economy grows, the government has to raise 2x% of GDP to cover the interest expense. That's how scary it is.
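
The "2x" arithmetic in this comment can be made concrete with a toy calculation (assuming, as the comment does, that the interest rate tracks the growth rate and debt sits at 200% of GDP):

```python
# With debt at 200% of GDP and the interest rate matching the growth
# rate, each point of growth adds two points of GDP in interest expense.
debt_to_gdp = 2.00      # 200% of GDP
growth = 0.01           # the economy grows 1%...
interest_rate = growth  # ...and rates rise to match (the "big if")

extra_output = growth                         # +1% of GDP produced
extra_interest = debt_to_gdp * interest_rate  # +2% of GDP owed in interest

print(f"growth: {extra_output:.1%} of GDP, "
      f"added interest: {extra_interest:.1%} of GDP "
      f"({extra_interest / extra_output:.0f}x)")
```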

That's why there's this sales tax hike and the consequent gdp dip. While other country with lower debt to gdp ratio can keep stimulating for a long time and only deal with the debt problem after recovery, Japan can't. It has to increase the government revenue relatively early, because it has a much smaller buffer to begin with.

8
techtivist 11 hours ago 2 replies      
Here's the biggest underlying problem. Contrary to what one would expect, Japanese firms are actually flush with money, but due to Abenomics, the weakness of the Yen means they are investing more and more abroad, rather than putting the resources into improving their own infrastructure. We all know about Softbank's recent investments in India and South East Asia. And it's not just limited to tech or even the private sector; even infra firms are investing in droves abroad, especially in South East Asia. In a way the Japanese growth rate doesn't take into account the rent its entities are receiving from abroad. So the short-term picture might actually be rosier than the numbers would suggest.

The bigger problem is actually long term weaknesses that this trend will expose.

Another huge problem is consumer lending. Most Japanese banks are actually pretty stringent when it comes to lending to their own people. So even if low interest rates might encourage consumer borrowing appetite, there's very little supply out there. I think the PM and the central bank need to address these, even if loosening lending might be contrary to what Japan has done in the past.

9
rrggrr 12 hours ago 2 replies      
Demographics. Fukushima. China. Japan faces enormous challenges. An aging population means increased healthcare costs supported by a (temporarily) declining workforce and decreased tax revenues. Fukushima's costs are incalculable but broadly impact healthcare, manufacturing, farming and fishing. Much Japanese manufacturing has moved to China, and adding to that expense is Japan's need to increase its defense spending to provide a small measure of balance in its territorial disputes with its neighbor. A wave of investment and entrepreneurship is required in Japan that I'm not sure the country is capable of fostering at this time. Bearish.
10
leot 6 hours ago 1 reply      
Japan is a country with too many people saving too much money.

So, one proposal: institute a small yearly wealth tax.

Avoids a lot of the problems with inflation-based approaches, and doesn't penalize people nearly as much for having liquid assets.

11
econew99 8 hours ago 2 replies      
I don't understand the following: Japan had QE measures in place, thus increasing the money supply. But then they also increased the sales tax from 5% to 8%.

Why take conflicting measures? How is increasing the sales tax going to make consumers and the middle class spend more?

Have Japan's economists not considered this?

12
retrogradeorbit 9 hours ago 0 replies      
Paul Craig Roberts on Japan, America and QE.

http://www.paulcraigroberts.org/2014/11/14/global-house-card...

13
j_lev 9 hours ago 0 replies      
21st century activism: refusing to spend money in defiance of the government.
14
zkhalique 8 hours ago 0 replies      
Why Japan, WHY!
20
GNU GLOBAL source code tagging system
75 points by pmoriarty  10 hours ago   15 comments top 7
1
pranith 3 minutes ago 0 replies      
And the latest available version on the most recent Ubuntu and Debian releases is... 5.7.1, from 6 years ago!!

https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=574947

https://bugs.launchpad.net/ubuntu/+source/global/+bug/127502...

2
jboy 8 hours ago 2 replies      
For me, the most useful information was the "Comparison with Similar Tools" table (in the OpenGrok project wiki), which was linked to from the GNU GLOBAL page: https://github.com/OpenGrok/OpenGrok/wiki/Comparison-with-Si...

As a Vim user, the comparisons to Ctags and cscope were informative.

And the list of supported languages is impressive! Awk, Dos batch, COBOL, C, C++, C#, Erlang, Fortran, Java, JavaScript, Lisp, Lua, Pascal, Perl, PHP, Python, Ruby, Matlab, OCaml, Scheme, Tcl, TeX, Verilog, Vhdl and Vim.

3
surki 7 hours ago 0 replies      
I have been using Global for about 6 years now. Mostly I use it in emacs (though at times I use the CLI version).

I use it on Linux and Windows. I mostly work on C/C++ (Linux kernel, Windows drivers, etc.) and have also used it for a few C# projects (via the exuberant ctags backend).

One real advantage I get: I switch platforms between Linux and Windows (the place I work for has both). So I use emacs on both platforms, and the same gtags customization works out of the box on both. This saves me from learning/using a new editor/tagging system for each platform.

FWIW: my dot emacs https://github.com/surki/dotemacs/blob/master/init.org

4
temp2 4 hours ago 1 reply      
For C/C++ code that you can compile I highly recommend rtags.

It uses clang (llvm) to figure out all the cross-refs so doesn't have false positives from fuzzy matching.

rtags - https://github.com/Andersbakken/rtags

vim plugin - https://github.com/lyuts/vim-rtags

5
comex 8 hours ago 1 reply      
But can it resolve method calls (in C++, or really any language) to the correct methods, rather than just listing all methods with the same name implemented for any class?

I doubt it, since this basically requires a C++ compiler and GLOBAL does not seem to have one. Yet that is what, after all these years, I'm still looking for for Vim...

Incidentally I remember there was a work in progress code navigation system posted to HN semi-recently (written in Go?), but its name eludes me. Anyone want to point me to it? I'm curious how it's progressed since it was posted.

6
jsond 5 hours ago 0 replies      
I'm all for FOSS tools, and have used (and still use) ctags / cscope (mainly in vim). However, the best source code referencing tool I've ever used is Understand (https://scitools.com). Yes, it's commercial, but it beats anything free that I've found so far.
7
SloopJon 6 hours ago 1 reply      
I use etags in Emacs to navigate our C source code, which is so much more effective than the brute force search I often see people use. I don't have as good a handle on our C++, C#, and Java, so I'll give this a try.
21
Hamms: a misbehaving HTTP server for testing your clients
48 points by nreece  9 hours ago   8 comments top 5
1
blatherard 1 hour ago 0 replies      
A friend of mine wrote bane, which does similar stuff in Ruby: https://github.com/danielwellman/bane/
2
shdon 1 hour ago 0 replies      
This really is a cool one. The different ports make it very suitable for handling a specific kind of problem. It might also be nice if there were a port that responded with a random selection from the errors (or with a proper response), e.g. "Port 5599: All of the above, at random".
3
adwf 5 hours ago 0 replies      
Having written a web crawler recently, it's surprising just how many misbehaving servers are actually out there.

Without proper error handling, a wide randomly seeded crawl would hit some form of malformed response or bizarre header within 15 minutes at the most. I eventually gave up on trying to parse all the myriad odd behaviours and now just dump them all onto a blacklist and move the crawler on.
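
A quick way to reproduce this class of failure locally is a toy "misbehaving" server, sketched below in plain Python sockets. This is in the spirit of Hamms but is not Hamms' actual API; the malformed response bytes are just one illustrative example.

```python
import socket
import threading

# A toy "misbehaving" server in the spirit of Hamms (this is NOT Hamms'
# actual API): it accepts one connection and replies with a malformed,
# truncated HTTP response, the kind of garbage a robust crawler must survive.
def serve_malformed(srv):
    conn, _ = srv.accept()
    conn.recv(1024)  # read (and ignore) the request
    # Bad status line, and a Content-Length that lies about the body:
    conn.sendall(b"HTTP 200\r\nContent-Length: 9999\r\n\r\nhalf a bo")
    conn.close()
    srv.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))  # let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=serve_malformed, args=(srv,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n")
data = client.recv(4096)  # begins "HTTP 200", which is not a valid status line
client.close()
```

Pointing any HTTP client at such a server quickly shows whether its error handling copes with responses that violate the spec.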

4
Ono-Sendai 1 hour ago 1 reply      
On this note, does anyone know about a 'misbehaving' HTTP client for testing servers?
5
chrismorgan 5 hours ago 1 reply      
A thing along these lines that I have come across before is http://pathod.net/. How do they compare?
22
Halliburton buys oilfield rival Baker Hughes for $34B
3 points by oulipian  1 hour ago   discuss
23
Googles NSA alliance: Deals between Silicon Valley and the security state
145 points by doctorshady  19 hours ago   31 comments top 11
1
WestCoastJustin 14 hours ago 3 replies      
Going out on a limb here, as I posted a pretty lengthy comment on Nation State attacks in the past [1], but does this Google/NSA relationship qualify as terrifying? Google finds out it has been targeted by a Nation State, which is actively siphoning off user data and company crown jewels, so they enter into an agreement with the NSA to share information about the attack, and develop technology and methods to prevent future attacks. I would argue that Google must share knowledge of Nation State APTs with the US Government, and other Silicon Valley firms, once they found out the scope included Symantec, Yahoo, Adobe, Northrop Grumman, etc. NSA/Google/Facebook/Apple/Microsoft/Amazon should be trying to advance countermeasures and level up the industry as a whole, because they arguably have some of the most exposed attack surfaces, host most of our data, and have the financial resources and in-house expertise to deal with it best. I would be terrified if Google did not have a relationship where it shared intrusions of this scope with the US Government.

[1] https://news.ycombinator.com/item?id=8316430

2
mikegioia 13 hours ago 2 replies      

    The NSA helps the companies find weaknesses in their products. But it
    also pays the companies not to fix some of them. Those weak spots give
    the agency an entry point for spying or attacking foreign governments
    that install the products in their intelligence agencies, their
    militaries, and their critical infrastructure.
That's probably the most frightening thing imaginable, but where is this info coming from? What are the sources on all of these accusations because it feels like a pretty casual mention that all of the hardware/software manufacturers in the US intentionally leave in back doors.

3
acqq 6 hours ago 1 reply      
The article describes two NSA systems almost as something made for Google:

"The cooperative agreement and reference to a tailored solution strongly suggest that Google and the NSA built a device or a technique for monitoring intrusions into the companys networks. That would give the NSA valuable information for its so-called active defense system, which uses a combination of automated sensors and algorithms to detect malware or signs of an imminent attack and take action against them. One system, called Turmoil, detects traffic that might pose a threat. Then, another automated system called Turbine decides whether to allow the traffic to pass or to block it. Turbine can also select from a number of offensive software programs and hacking techniques that a human operator can use to disable the source of the malicious traffic."

But if you followed all the news since Snowden appeared, you'd know that the TURMOIL is simply the NSA's global passive internet (and more!) monitoring system and the TURBINE one cog of the global active "attack on the internet" one.

https://robert.sesek.com/2014/9/unraveling_nsa_s_turbulence_...

"TURMOIL is a high-speed passive collection systems intercept [for] foreign target satellite, microwave, and cable communications as they transit the globe"

"The TURBINE system provides centralized automated command/control of a large network of active implants"

4
thrownaway2424 13 hours ago 0 replies      
I assume the book has a lot better sourcing and footnotes than this excerpt has, because the excerpt has none of either. The only part of this article you need to read is "It's not clear what the NSA and Google built after the China hack. But [thousands of words of speculation and innuendo follow]" It's not clear, as in the author doesn't know.
5
lawnchair_larry 16 hours ago 2 replies      
Once again the meme that China was hacking to find dissidents was a lie for propaganda purposes, and the actual motivation was to see who the USG had wiretap orders on, to see if any of China's own spies had been burned.

But, it makes China sound evil when you tell the public that they are doing it for human rights reasons.

http://www.washingtonpost.com/world/national-security/chines...

6
junto 5 hours ago 1 reply      
It would be ironic if the original attack was actually the NSA pretending to be China as a false flag.

Google then willingly accepts the NSA's 'tailored solution', which was simply a trojan horse to monitor Google assets (i.e. users) from inside the network.

Unlikely but would make a good fictional story nevertheless!

7
cwisecarver 15 hours ago 0 replies      
The two things I got from this story:

- If any one of us who didn't work for Google had cracked into a server that breached our servers and just looked around, not destroying data, wouldn't that be illegal?

- The NSA is getting access to software and hardware back doors before the public is made aware so they can try and catch Chinese hackers. Doesn't this also give them the access they would need to route all our traffic to their giant datacenter and mine it? Aren't the 'Chinese' hackers giving them a convenient excuse?

8
drderidder 14 hours ago 0 replies      
The fact that nobody blinks when agencies openly admit doing this for economic interests is the part I find terrifying. Could there be a more blatant admission of the unethical nature of the military-industrial complex?
9
justcommenting 10 hours ago 0 replies      
kudos to shane harris for shedding light on our industry's modern story of Gleichschaltung (https://en.wikipedia.org/wiki/Gleichschaltung)

instead of representative democracies regulating, protecting, and/or supporting technology firms and citizens through systems governed by laws and regulations, we're entering an era of opaque and voluntary "partnerships" where all tech companies are equal, but some are more equal than others.

this sort of coordination outside of legal and especially democratic processes has implications for everyone, and should concern us all.

perhaps unsurprisingly, moxie portended these developments in 2010: https://www.youtube.com/watch?v=Uxz7r4E2li8

10
hindsightbias 17 hours ago 0 replies      
Multinationals that give preference to one nation's TLAs have no right to complain when they're penalized for being preferential.
11
notlisted 12 hours ago 1 reply      
Not mentioned in the article, but I believe that around that time Google switched developers 'en masse' to the Mac platform.
24
The Rise of Extreme Daycare
41 points by tokenadult  15 hours ago   39 comments top 5
1
vidarh 3 hours ago 2 replies      
The US looks more and more like a third world country. The descriptions of working situations in that article are horrifying.
2
fwn 3 hours ago 3 replies      
It is a great tragedy that there is a need for such 24h daycare services - both for children and parents.

Speaking of parents: I really wonder how they got to this point. Clearly, having two children while needing two low-paid jobs seems like the opposite of a risk-averse strategy.

3
veb 4 hours ago 2 replies      
I have so many feelings about this article. I feel so much sadness for the parents who need to use these kinds of places, but yet I feel happy that the kids have somewhere to go and that the parent(s) can pay for it.

I still don't like it much. The article's projections just left my head shaking.

Someone needs to tell these programmers/marketers that there's a wee factor called "humanity" that they need to use when creating their algorithms. I wish it was as easy as that. :( Could someone enlighten me around this? I'm not in the US, and I have a hard time even thinking about everything being open 24/7. (New Zealand is where I reside)

Though, the people who run these wee centres are doing some selfless work. Absolutely amazing, and good on them for doing it. It makes me happy there's still people like that around. (I realise not everyone would be like the daycare in the article but I'd like to think so.)

4
josh_fyi 4 hours ago 1 reply      
Notice the 29 hour-a-week job? Healthcare, anyone?
5
parennoob 2 hours ago 1 reply      
If you have a bunch of single people raising kids on their own, something like this is bound to happen.

"Diana and Ivettes mother, Marisol, for instance, is raising the girls on her own, working at a supermarket from 8 a.m. until 2 p.m. and at Home Depot from 6 to 10 p.m., six days a week. "

"This clock has highlighted weakness in our social networks. In 2013, 28 percent of children were living with a single parent; 77 percent of those single parents are mothers."

Sounds like at some point, people are going to have to make a specific choice between raising kids on their own (potentially) or not having them at all; and vote for Governments that favor the policy they prefer.

25
Show HN: SineRider, a game inspired by my TI-86
121 points by SigmaEpsilonChi  16 hours ago   24 comments top 16
1
shalmanese 13 hours ago 0 replies      
On OSX/Chrome/rMBP, I couldn't edit the equations in the Unity web player unless I was in full-screen mode, and then, when I entered full screen, exiting it caused my computer to enter a locked state which only a full reboot fixed.
2
tempodox 1 hour ago 0 replies      
LOL, this game should be given to everyone who complains about how complicated math is. Should cure them in minutes :)
3
unnikked 1 hour ago 0 replies      
Will there be an Android version? It is so addictive :)
4
readerrrr 6 hours ago 0 replies      
Thank you! Game of the Year!

I wish I had this in school. This might just spark a new interest in pure math for me.

...

I think I broke it: y = x^2 / ( 6000/t^(t*t) )
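
For what it's worth, that formula rearranges to y = x^2 * t^(t*t) / 6000, and the t^(t*t) factor blows past double precision very quickly, which is a plausible way to break a float-based plotter. A quick sketch in plain Python (nothing from the game itself):

```python
# y = x^2 / (6000 / t^(t*t)) rearranges to y = x^2 * t^(t*t) / 6000;
# the t^(t*t) factor overflows IEEE doubles for fairly small t, which
# would break any plotter that evaluates it in floating point.
def y(x, t):
    return x**2 / (6000 / t**(t * t))

print(y(1.0, 3.0))  # 3^9 / 6000, still finite

try:
    y(1.0, 100.0)  # 100^10000 exceeds the double range
except OverflowError:
    print("overflow")
```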

5
tophattom 15 hours ago 1 reply      
I'm not able to change the function on the second level. The input field is not responding.

Seems like a great concept!

6
kaoD 16 hours ago 0 replies      
Brilliant.

I always learnt more by doing. A few years ago I came across ARCalc[0], and playing with it really made me grok functions. Now SineRider gamifies the exploration of functions. Neat!

[0] http://www.pouet.net/prod.php?which=30468

7
_sigma 15 hours ago 1 reply      
On [1] I'm not able to load anything. I just see the header text and an empty black square. Chrome 38.0.2125.111 (64-bit) on Fedora 20.

[1] http://sineridergame.com/SineRider.html
8
hrc2 5 hours ago 0 replies      
Hey, very cool. I'm working on a hobby project using Unity that also deals with rendering equations, and I'm curious, did you use LineRenderer to render the line?
9
mdturnerphys 9 hours ago 0 replies      
Small complaint: the number pad doesn't work for inputting numbers on Linux. Some other characters are inserted (e.g. '' for 2) which are invisible in the box but make the function invalid.

Thanks for the fun!

10
kaoD 14 hours ago 1 reply      
I've hit a bug on "Waves - Order Still Matters!" where the graph is affected by zoom and object positions. Sorry about the short report, but it's really hard to explain, it just happens.
11
LukeB_UK 14 hours ago 1 reply      
This looks awesome! I remember playing line rider when I was in school!
12
TheSoftwareGuy 15 hours ago 1 reply      
A great concept, but the tutorial portion is just soooo long.
13
hughes 9 hours ago 0 replies      
Doesn't load! Uncaught Error: Bootstrap's JavaScript requires jQuery
14
vxNsr 8 hours ago 0 replies      
This is pretty cool, thanks for sharing!
15
julianz 15 hours ago 0 replies      
Very cool, it really teaches the nuts and bolts of how functions map to graphs. Love it.
16
dang 16 hours ago 0 replies      
We added "Show HN" to the title because it looks like a perfect candidate for that. If you don't want it there please let us know.
26
Getting Better at Getting Better
198 points by juanplusjuan  21 hours ago   63 comments top 12
1
tokenadult 17 hours ago 2 replies      
This is a very good article. The reference to the newly published book Faster, Higher, Stronger: How Sports Science Is Creating a New Generation of Superathletes--and What We Can Learn from Them by Mark McClusky prompted me to request that book from my friendly public library. I like how the article looks at the absolute skill levels among professional competitors in chess and professional performers in orchestral music and shows that the skill level in those and many other domains has been steadily rising in my lifetime. There is still a lot of untapped potential in most individuals alive today that can be developed even at adult ages.

As the article reports, "What we're seeing is, in part, the mainstreaming of excellent habits. In the late nineteen-fifties, Raymond Berry, the great wide receiver for the Baltimore Colts, was famous for his attention to detail and his obsessive approach to the game: he took copious notes, he ate well, he studied film of his opponents, he simulated entire games by himself, and so on. But, as the journalist Mark Bowden observed, Berry was considered an oddball. The golfer Ben Hogan, who was said to have 'invented practice,' stood out at a time when most pro golfers practiced occasionally, if at all. Today, practicing six to eight hours a day is just the price of admission on the P.G.A. Tour. Everyone works hard. Everyone is really good." This kind of cultural change can still go a lot further in a lot of fields of human performance. A culture of continual efforts at self-improvement has hardly even begun in many occupations.

The article's conclusion about improving the performance of elementary and secondary school teachers in the United States is thoughtful, and also refers to good new books, Building a Better Teacher by Elizabeth Green and The Teacher Wars by Dana Goldstein. Studies of educational effectiveness in the United States consistently show that the variance in teacher quality in any one school swamps the variance in school quality between one school and another, so any child in any school district is at risk of getting an ineffective teacher. (Although schools in poor neighborhoods of the United States, on the whole, have the greatest difficulty in hiring and retaining good teachers.) Anything that can help teachers learn to teach better before or after they began working in the classroom will have massive social benefits. An economist who has studied teacher effectiveness for years shows that the best teachers are almost literally worth their weight in gold, while the worst teachers have negative added value for their pupils.[1] Bringing a culture of continual self-improvement in America's schools is a project of crucial national importance.

[1] http://hanushek.stanford.edu/publications/valuing-teachers-h...

2
gumby 19 hours ago 4 replies      
I am uninterested in professional sport but read this anyway and was really struck by this line:

> "...historically, practice was ... not about mastering skills. People figured that either you had those skills or you didn't."

I suspect many people still feel this way. Those who keep trying to hone their skills are the fun ones to be with. Unfortunately they appear still to be in the minority.

3
mlucero 8 hours ago 2 replies      
This is a great article, but I wish it had also included advancements in the area of steroids. We live in an era where drugs fuel a significant part of professional sports. Top-tier athletes also have top-tier drug regimens, and their "doctors" have found ways to 'hack' the testing. It's an area that the media isn't open about discussing, but it is there, with millions of dollars at stake.

I know I'm leaning heavy on the sports side of the article but this isn't all a result of refining the skills required for sport. They are also faster, stronger, and recover more quickly because of the drugs athletes take.

4
moab 17 hours ago 1 reply      
This distinction is already visible today in our field. Looking at my batch of CS grads, there's a clear divide between people who took on riskier positions or joined startups and people who continued to do what they were already good at. A few years after graduation, this divide is already fairly stark - with people in the former group being exponentially better than when they left school, and people in the latter not growing significantly.

The best people in CS are no different than the best workers/athletes in any field. The challenge for the next few decades will be to see how we can improve the pedagogy at Universities to help people learn to learn better.

5
corysama 16 hours ago 0 replies      
Here's an example of a school applying Deming's/Toyota's/Lean's techniques to great success.

http://www.youtube.com/watch?v=VGZHQnuZXj8

6
amjaeger 19 hours ago 1 reply      
I didn't expect this to be an article about education, but it seems to make a lot of sense. I have always felt that bad teachers recognize that there is a problem with their classroom management; however, the solutions they come up with aren't great. I also have seen that most bad teachers have common problems, which would imply that with a pretty standard set of instructions, a bad teacher could transform into something better with just a bit of work.
7
quickpost 18 hours ago 1 reply      
Reminds me a lot of the notion of Growth Mindset vs Fixed Mindset that Carol Dweck has popularized.
8
thewarrior 17 hours ago 8 replies      
Is there any place where I can find a programming coach ?
9
lipnitsk 7 hours ago 0 replies      
Since his name came up in the article, it is worth reading more about the ideas that W. Edwards Deming[1] lectured on.

https://en.wikipedia.org/wiki/W._Edwards_Deming

10
chrisduesing 19 hours ago 7 replies      
Does this imply we are headed towards a future where there isn't a 10x difference between the best programmers and the worst? Where someone comes up with repeatable training that can help programmers advance throughout their career?
11
hookey 18 hours ago 0 replies      
Too meta.
12
amelius 18 hours ago 1 reply      
And how would this apply to hacking? Or entrepreneurship?
27
Amazon Moves to Extend Cloud-Computing Dominance
22 points by selmnoo  8 hours ago   8 comments top 2
1
ghshephard 5 hours ago 2 replies      
Some parts of this article feel like the author doesn't have much experience in the field of cloud services. For example:

"What makes Amazon unique in the fight to own the computing cloud is what its not a traditional tech company with a long history of providing products and services to business customers. Finding a way to deliver services over the Internet that behave like databases and other traditional software products will be critical to keeping its lead because older tech companies like Microsoft are already capable of doing it."

I understand what the author was trying to get at - IBM/Microsoft/Oracle have 20+ years of providing Business Services, (Well, IBM is closer to 100) - And Amazon has only been providing cloud services for about 10 years - but what an incredible 10 years! Most people would suggest that Amazon is the market leader in providing these types of services to business, and that IBM/Microsoft are playing catch up.

A good counterexample is Salesforce - they've only been around for 15 years, but nobody would suggest they aren't a dominant player in their industry.

The reality is - when it comes to new and disruptive technologies, the innovator quite often becomes the dominant and trusted player much faster than in traditional (non disruptive) industries.

2
SixSigma 5 hours ago 1 reply      
> Amazon, has returned less in profits in its 17-year life as a public company than cloud competitors ... earn in a single quarter. If Wall Street grows tired of Amazon's continued losses, it could also pinch A.W.S.

The share price two years ago was 250. Today it's 320. OK, the peak was 408, but that is still pretty good.

Over the 17 years it has gone from 1 to 320.

I'm still calling Hold / Buy

28
Lyra: An Interactive Visualization Design Environment
96 points by jcr  16 hours ago   11 comments top 7
1
danso 9 hours ago 2 replies      
How does this differ conceptually from Tableau? Yes, I'm asking this (annoying) question even though the OP states, "Lyra is more expressive than interactive systems like Tableau, allowing designers to create custom visualizations comparable to hand-coded visualizations built with D3 or Processing."

...yeah, but how exactly? Because it looks about as complicated a GUI as Tableau... and I don't have enough knowledge of Tableau to compare it against the video, as Tableau's interface is so befuddling that I thank God I stumbled into web development, as painful as that journey has been, so that I could code my own interactives rather than have to learn Tableau's conventions.

I guess the issue with Lyra is the same as with all other programs that claim "custom visualization design without writing any code"... the two desired features, "custom" and "without writing any code", are, IMHO, at odds with one another. If you want to do anything custom and interactive, you will pretty much have to do something as complicated as code... and pushing a series of buttons and clicking through menus may end up being as intellectually challenging as just learning programming.

Also, I don't see how the Lyra visualizations are comparable to D3...D3 is amazing because it is a relatively minimalistic framework for coding visualizations...the kind of flexible, expressive visualizations you can do are possible because you are allowed to expressively code them via D3. I don't really see how Lyra (or any GUI) could accomplish that conceptual feat.

2
polskibus 3 hours ago 0 replies      
Great job! Impressive to see an open source project pick a fight with established software packages like Tableau. I have used many BI packages in my life (only a few of them for longer than a while) and consider myself an intermediate D3 programmer.

Some feedback if you need it: the UI seems very slow in Chrome, and I don't find it intuitive - there's a lot of dragging and dropping from various places to establish a simple data vis. Ideally you'd like to see what is available, choose it, and later play to tweak it, not have to do lots of configuration upfront. Perhaps you could default your tool to a line chart (or some other kind of early visual feedback) so the user knows he's on the right track from the beginning?

3
hitlin37 3 hours ago 0 replies      
I haven't tried this one, but did use Tableau (weird and difficult to spell) for a while. But to date, the only tool I found useful is Statwing. Tableau is somewhat helpful with its multiple options, but using it isn't intuitive at all. The thing I liked about Statwing is that you just throw your data at it and start experimenting with different relations in the data. I see Statwing as the first level of analysis, then moving to more sophisticated viz in d3 as a next step. But I'm interested in this area and in using more such tools to get the viz right.
4
AustinBGibbons 16 hours ago 0 replies      
I've been tracking this for a while; the design potential is fantastic. I'm really excited for when it can be used to quickly create dashboards with streaming, updated data.
5
njx 14 hours ago 0 replies      
something similar but does dashboards+ https://my.infocaptor.com/free_data_visualization.php
6
Li_spallation 4 hours ago 0 replies      
Free software built for astronomy that appears to do what Tableau (and to some extent Lyra) does, without the gloss.

http://www.star.bris.ac.uk/~mbt/topcat/

7
azeirah 14 hours ago 1 reply      
Ah, I love seeing programs inspired by Bret Victor's talks and demos!
29
John Perry Barlow Online privacy double agent
21 points by denzil_correa  7 hours ago   4 comments top 2
1
pmoriarty 6 hours ago 1 reply      
Barlow seems to agree with David Brin's thesis in "The Transparent Society"[1] that the inevitable loss of privacy of ordinary citizens will correspond with an increase in transparency of governments, the powerful, and the apparatus they create.

This is dangerously naive optimism. The asymmetry in power and resources ensures that no such balance in transparency is possible. Ordinary citizens do not have gigantic datacenters and armies of mathematicians, spies, and computer scientists at their beck and call to monitor the government with.

That's not to say there hasn't been some notable progress with respect to government transparency, but they are a few fireflies when compared to the floodlights that the powers that be have at their disposal.

[1] - https://en.wikipedia.org/wiki/The_Transparent_Society

2
xnull2guest 6 hours ago 1 reply      
I don't really get a sense from this article that Barlow is a double agent.
30
The Beauty of LaTeX (2011)
175 points by pmoriarty  17 hours ago   120 comments top 15
1
boshie 15 hours ago 1 reply      
A very useful tool for writing LaTeX when you are often switching locations is ShareLaTeX. You can write LaTeX in your browser. It allows collaboration too, similar to Google Docs.

https://www.sharelatex.com?r=890185d4&rm=d&rs=b ( <- referral link, a referred user enables me to add more collaborators to my projects. Here's a non referral URL: https://www.sharelatex.com)

2
riobard 16 hours ago 8 replies      
(La)TeX the typesetting engine is great, but (La)TeX the language is clunky. Without proper support for namespaces, it's rather difficult to write abstractions and macros without worrying about side effects. I have real trouble guessing which package a command comes from.

Are there any alternatives that have solved this problem?

3
ginko 2 hours ago 1 reply      
The thing that makes me wonder is that all these typographical features should be easily supported by a WYSIWYG editor. From my perspective, all it would take would be for Microsoft to buy some professional-grade fonts and put a couple of developers on the problem. None of this seems inherently incompatible with the WYSIWYG workflow.

Am I overlooking something here?

4
amelius 3 hours ago 1 reply      
My main problem with LaTeX: it is not composable. Explanation: you can't plug a random piece of LaTeX inside a random container (environment or macro) without it being formatted completely wrong, or without it triggering all kinds of error messages.

Composing objects hierarchically should be natural (easy), and the user should not be required to read tons of documentation for each case.

I think HTML does a lot better in that respect (though it has its flaws too).
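
A classic concrete instance of this (a minimal sketch; the `\mybox` macro is made up for illustration): verbatim text works at the top level but cannot survive being passed through a macro argument, so wrapping an otherwise-fine fragment breaks it.

```latex
\documentclass{article}
\newcommand{\mybox}[1]{\fbox{#1}}  % hypothetical wrapper macro
\begin{document}
\verb|a_b|            % fine at the top level
% \mybox{\verb|a_b|}  % error: \verb cannot appear inside a macro argument
\end{document}
```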

5
chronial 15 hours ago 1 reply      
It should be noted that Word does indeed support all ligatures and glyph variants since Office 2013. Transparent text is also possible, but I guess it already was back then.
6
Argorak 16 hours ago 2 replies      
I would also like to encourage you to have a look at ConTeXt. It goes the other way from LaTeX and makes you think about layout more. I use it to create all my slides and have far more fun with it than with LaTeX beamer.
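For the curious, a minimal ConTeXt slide deck might look like this (a sketch under common defaults; `S6` is a conventional screen-sized paper for presentations, adjust to taste):

```latex
% Compile with: context slides.tex
\setuppapersize[S6][S6]   % screen-sized pages, typical for presentations

\starttext

\startstandardmakeup
  \midaligned{\bf A First ConTeXt Slide Deck}
\stopstandardmakeup

First slide: ConTeXt keeps layout setup in the preamble,
so the body stays close to plain content.

\page % each \page starts a new slide

Second slide: compare with beamer's frame environments in LaTeX.

\stoptext
```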
7
bbcbasic 16 hours ago 3 replies      
I would actually like to start using LaTeX. I like that LaTeX would work well with source control, and that you can use a simple editor like Vim to edit it.

I hate using Word and having a load of random formatting applied to my document. I am fed up with using the 'format painter'!!!
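For anyone in the same position, a minimal source-control-friendly starting point looks like this (a generic sketch, not tied to the article):

```latex
\documentclass[11pt]{article}
\usepackage[T1]{fontenc}    % proper font encoding for accented characters
\usepackage[utf8]{inputenc} % UTF-8 source files
\usepackage{microtype}      % subtle spacing refinements

\title{My First Document}
\author{A. Author}

\begin{document}
\maketitle

\section{Introduction}
Plain text with markup: easy to diff, merge, and edit in Vim.

\end{document}
```

Because the source is plain text, `git diff` shows meaningful line-level changes, unlike a binary Word file.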

8
rspeer 15 hours ago 2 replies      
This article claims that "LaTeX supports Unicode". Does it now? Can you just drop in encoded characters from any language your fonts support, and get them rendered correctly? That would be a huge breakthrough, and last I knew, this is not at all the case.

LaTeX supports rendering various kinds of diacritics and math symbols, through its own mechanisms that aren't Unicode. If you want Unicode, you need to use a separate project called XeTeX. XeTeX's home page [1] introduces it as "Unicode-based TeX".

[1] http://xetex.sourceforge.net/
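For reference, the XeTeX route mentioned above typically looks like this (a sketch; the font name is an assumption, substitute any installed Unicode font that covers your script, and compile with `xelatex` rather than `pdflatex`):

```latex
% Compile with: xelatex unicode-demo.tex
\documentclass{article}
\usepackage{fontspec}            % XeTeX/LuaTeX font loading
\setmainfont{Linux Libertine O}  % assumed font; any Unicode font works

\begin{document}
Direct UTF-8 input, no escape sequences: na\"{\i}ve? No: naïve, café, Σωκράτης.
\end{document}
```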

9
honorious 14 hours ago 2 replies      
I have been using LaTeX for many years and I like it, but what's stopping other document-editing software (like Word) from reaching the same quality of results? LaTeX has been around forever, but nobody else comes close to that quality.
10
baddox 16 hours ago 4 replies      
> Common ligatures are essential to professionally typeset text.

I like the idea, and they're aesthetically great, but I think they're too rare now for it to be wise to use them. People will be confused and distracted. I have only one anecdatum, which is that Slack chat uses a font with ligatures, and I have heard several confused comments about them. For niche professional documents, like research papers, I'm sure it's accepted and expected, but you're probably already using LaTeX for those anyway!
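If you want this level of control in LaTeX, fontspec exposes ligatures as a toggleable feature (a sketch; requires XeLaTeX or LuaLaTeX, and the font name is an assumption):

```latex
% Compile with xelatex or lualatex
\documentclass{article}
\usepackage{fontspec}

% Common ligatures (fi, fl, ffi, ...) are on by default
\setmainfont{Linux Libertine O}  % assumed font

\begin{document}
With ligatures: office, affluent, flight.

% Switch them off locally for audiences that find them distracting
{\addfontfeatures{Ligatures=CommonOff}
Without ligatures: office, affluent, flight.}
\end{document}
```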

11
emersonrsantos 16 hours ago 1 reply      
You can have the best of both worlds (TeX and WYSIWYG) with LyX, and possibly other editors I haven't used much (Maple used to render to LaTeX 20 years ago, and it worked quite nicely).
12
paulgerhardt 16 hours ago 0 replies      
Is there a current go-to standard for getting this functionality in the web browser? A JavaScript framework that supports ligatures, river elimination, dynamic hyphenation, and so on?
13
quink 16 hours ago 1 reply      
Year is wrong, BTW.

I remember reading this a few years before 2011.

Edit: here we go, https://news.ycombinator.com/item?id=1173226

14
dendory 14 hours ago 0 replies      
I wrote a quick tutorial to LaTeX a few weeks back for those who might be interested in starting using it. http://dendory.net/?w=544fb21d
15
jvehent 16 hours ago 4 replies      
I prefer HTML & CSS, maybe with something like reStructuredText on top.
       cached 17 November 2014 14:02:04 GMT